[Binary content: POSIX tar archive, owner core:core]

var/home/core/zuul-output/                    (directory, mode 0755)
var/home/core/zuul-output/logs/               (directory, mode 0755)
var/home/core/zuul-output/logs/kubelet.log.gz (file, mode 0644, gzip-compressed kubelet log)

[The remainder of the archive is gzip-compressed binary data (kubelet.log) and is not recoverable as text.]
`{͑ %wy0b6{o8z*K MP[6]6 vʿ}*AɻeXڨiPx]:l4-dKxAmYJ 2.+*,n|͇P̪s(TF_$S:;eBT{/UXOj/yv[" |g@(2ί&3GP 0IA4h=s (R Hi U":^L`Jc1 iRaI,Qc ιH%>(gЎ״Ng[@+Ԗ?_6JXY{֌(g`&`mV>/f3%>R*`҆D)JPE TRN"Vz?дX6yDЬ[G,ǯ47/e:.Gp_2/1+)q̄4HnՒKaqkd$QͰ&:}x9rBw]%hh 5"Ȩ{fb$Gf}kXʩhgAG9D*nC|5[5(Yp9vRٕzޅg bV*%".?ǟ^qzF[y6ewA<,xSDc샙}Tl˼j}qZhkbN\>1y&UwI.dK8H6]@f]g?6jۮ밞{]̻;5v~o2|>;'_}x衋g&Sκ6(tHpvuz x߷}ﻙ{OZHqd[u;\)LkIb V Q$0ǂJDKLJ ,UBKA͒)h>~owUyq0!vAH"@3GM"!Ă  2T )X1GOUf&&q|k~ xDGXbiEw[B3/Yr8z6/@g\`ft hںZuOs7͊ML*ɞ-|5],kDa}Ͱ3-a2ɯnR;P Lc,Ϙ='Q.x1J$Ҙsͩ*̜T s#4BD* ZEIҊ%>rzR\cN]6"t]ý+AOd׏pQ2cu|)U\>zwv^Ʀraj_k.r{( j_sfgztbY{m'f,{fc=Z^VL-$U{v*4Ծa<\\AmqFe C/Iʭ{\+sXTAMUJMX] <ɥo(ܺZ;iEks" w<kUBe:lPxI1o٣XdRVuq(:օx0 j tcWrW+.DPԟ8y{r}|٫Վ]A&,EpY<tL t1#}`ws`7B}v.p2#&DcpT6![a(JP$21RZh/샽OzJ,"twxO%? X|Z)L:Jq:˱W]y>^i߬]c2k0"wySCy {ǙK[XL>/ xhr1O7^is=p|9&3 s}߽}hh+(kG" #so .~Ql½>zg/ c#Y+0vv!vbc}6oO+dx(Q[=C9#c\5QUwOUwDźD6o"(..+/vQm-n={\}omT?J>kn!㩕 W>U=Zxoqeg0._lθL'i{o;՝/Sg˙1[?ց9'E`Mo_+OAvP>f\ Y/1&X!Bu厃*^۝%?rIP"OH*)?`Vi䞣d<(!%w%) dkMr+P9o=nn@ǟg7c}XMim&N;?!=ZSe<3z~5sΞ_zN`:1 @"Fo#OHR9bQQ!.f\$Z'-$b(8FA.:C |5UŻ+$"αE: Q 0"'3J 0 Wn46o!xuɂ{gcElyBxRpFs%( [H((V9pPԕNDB9\H(`Ԩ2 F-b=6a% a̷P*sKɜ- gdq0|ԵP N˻#W9U$ƥ':yi NASakD"9 `օ0YB s~+|>FQ1+DL [H(0K" ju^4>0tXվa%ե[^h5uIm1#'`lTxPCE"y+p,yfA]àϮFBҔHgI nH\0H(> \VNC6T#C`Jyn,oBB g o1:A9,(PJqGLDyH(^-#$\AS!F$^JoMJOx`K"x4>K5x1Ϋ7VW6d: #KqBasb"lP<-I@5EDctZ"U`R^ %+K+<fc-ryKB9,Mm$?J[(Vx 8ʩBHZf6Jgw@[aH0RQHAk#0xYRRy&yrC<b $"/TRlTmTPZC*rl?z<SuuPZlI 9rA&DLQ|Ei#VUMAP,aT@QH(ވSF#?4Xb2K##Vur-h#W grs>v;V|m~w~Ka=~wA $fcU.%=PuK; $xJ1gF RWb9*Wl ʥU]Rm(ǩ($ wBADT Ϯ,Fz BU֜aqRu҈oڿ{_؟X~zZ3@>}AI焳@5O9sg{;:|yηJWܼF#PU!~0f[һZ>~ɬv"hҟQ>m2Rok@A{sv;9nrb -=>u3Տ߾|w Z_Um%ͧ]H;uvʁ H鼼zObP"m;[oouUMoyϰ|ˊ-8ܱ ne?uwFP88yjk!8^ק?23Mյwz9 _l ty:N?W|*;uY/cNy3~<ް?X^}*-gJ%p*L",7}%{}'P*wu6?{]?E"}]ՅϤj/5ۙ;A&DCr\h̪3xA-wwo A UW^rI7;pލL?+6hWJ_qڧcVxr?"zD'Uu];˜Vʋ./Wkk㍁ru^Po; !xj7·6Pd{1'B&tTLS6pO3pt!k?9mNuqR*Su[.ʓ˵Vn 'W{qw_igi%62nu4[mMr=)XZ /.$77N*B. 
h=ۣtX{h/36QU-{f{7,iet})S z/w*-`/$/{cҭO~O1~t~VKWtkMh~YwNeXvx<2k'N?ρSAs ۓWۓA5S1w(\LFY^CY7\M+7贋b/O }iqJLzǯzq{]Z縆mհ:Ɲ<:'plPCJ(Zv~yuT5< #kzç/2{9O){ܶ &mh;;>xi眦3V-K(IdaRDDHpX,ow"9(T[]F4%n[@}/GP 샽=N&޿'.|~N{ pwo}9{?M;?| ?́ vFw;g=r(E_h%~H:Aݽrs)Rf(ݛ Tw5Ras;Q3͹^'yq-Uzd";,Iz}ZlSڸoxBbN&7a6̳,dh,<hBB!v\`L:,rB 3O)ՉF^0t^ 53`w?~EA0(ZT)B"${r=x$QpF_h6mM`v#t 1G,v+=Bv v>6<3h6o.Aoa ̓E͗4_@e{vE\~zN2>P4;x5vRlRJX*6T59W6* ޹Qbf zp{؃AG};80H`  ZiB¥Lk0&:@Ljpr-GV C\ZPV鄺x) g5U 1rN 7CN- c]ۃaцYE6deUl{bfRIhc#;"*}1p7\fC1!O1%#(.bΈQ]< jLъ`$V3j!rD}@a`s.sIE\y Ae[vâ'Y_/NO/R R;q_Ɉ\{X;{st88f?Kֱb=‹E5wHs}C]9f^[b'$Xg <(¥"r7!u+.1s1=(C9zS ;-`P3<y~a󋓳YflGfiLKD'ɓ_|EePT\WV~M^ 8낳d:WҦ^N椧"~ÇTr*2<%|[ Ԏ0vQʅs7:`,d𬢮 *JA8=HO"ȗAq? T0t߰.N<""wAHS `/osxv|xqx黝w;ܽVcsm-'F Ro)P[ZGoSzϲ֣GL)| fuWWWժvd~,F-3^>z}N;3 W}Wr ea(_ܖgo$w_L`U޾)+vga0vsW< 3Wc=fe9q_ O|uɅ@9ђFz*z<8i'8s$kIB};f2B:B'ej̉}{cP[VՓZSNBgG:%2Իj6i  UbJۿX#z+k=X֍Ѝ TSG2:Lw¼tdRnz֠ORtAkkxXt 8 M!q3췃!GK0 >VͣO0_4[q Cu 'O?W3Þ=klfτ He Q }juO)iL/ ns?C1$y'4@W]e9Pck%U/Kw`s|ZVw*kQz־f|Pdu/.ݟq5rT'A9]ay (O|G\Tdy'UekzVHOvNuo%{p-!( }^PB*Jd*J}RQ.&7&V&=^%X׳Ә:NCuf/~"+= \1iA;a4bbhм  rUmԲ 3] :V e'3WU3U ?v[f*eC$qP?An)0:tqZ]7v{&IelmٮK 6c^:XWuV3yz8BYJxxhGoQn7v(ud7~||XX_=~|qep=1ӻm봕sݣʲqv>ϴZґ`W$UZFU)aÍYïߖd[s|qSgѬ__W=(B[o5&ᵩgKUKMwFrK-rr0|³]e3Ke'$N:dFF!$f!hSLu{8JBP"_i'K!7qrV]+Ls2LϫIl`waH6=H9~X){?blB gd(:8pD?{hP-wߠs٥4 cűO*G8ݻw {ʰ%+#t1w|Flct/gByDQ,mPNBXvTIM_xztW8y:@aӊn0u/aj\Z6(Eb铋[#2l'H>)>񳞧wu2w;7~Ja}b3ON`JlnTVi)ө\_'@LiT[AeCD:ԇq0rƔ1"Чr}0qGD!R"aEN~Cf^yؒ\I>f(4ϪVC':uwVTm 30 >78n.a:[J4%tz'c<{6t@6}6o;WM:KM/ɨ T+5Q& 7Pq$ GUiɔìי91f^WcZcS&;;M}_?^ :xt:R\P[iu}7}v̒ ԭ#R;A]pE.+𶓽R˶3+j' FM{4Ğua-&!F@p:&$`gݳX@E{R^[xxHk4a[UImZ=0XW^+xo q~3?Y8;VLy'lPg&JNi(:~yrz6,@줄r'0,@pTfr&k?Mxrӎ.O0h^f:3N8Y❒1i(J(:lf2lO/Fv<1Lc)`M&Ŵ|dǓמEko+MK\K2.v q`uW%&ތgnY3H)9w!5ՎAl쳭\j : PT}$nดʆ~GA<D96B٘&T([QJppk;UqEʟz1%!Xe(r]`&"U{3BDD V Qmb9)jD+_v9(l@yHA&lnj[ًq S"K;,/B6wEQ03"rPޕ&sw}{AMȺ9ao(,|1ϒSMq?|wݏ, =[ 1p@PI*f $ss) bR@0i$#W נ;mZhF(+s•>t@kVl6̲ovǿ\!BbNۋ"dc0QَOS)dke|>I%z^e! 
`O9"T G> UHn 0p$"FRdq W^|TtQrtxؗpFm /4S LbÈl\fc'7?K Ƴ႞C_ILIz{-e0l2_!by po+ m8nl ;l h-K${Ƴ~~Ȓ[lsѨ)6UXԜiS%?}w-7O~2 /h7}Zԉ7`RZ[SܔlwPcq?φ h _.xk@9S~->P~TNwOM-*xǠeon.IW)Dy@G5sB-̥8.\QXDڤz֬e3ŃdWTGǃeђ˿Wp3cc/?@^^Į2/[iv9m[ tO|8Xf6U#vsG&>ӹVXհdԝþJn_VK wEW/P YV$ EDL9h>k|%LjyHN1:=V)nŀ=N;[B7zf֕Jf ^I|}/dz3j\>?|JSz:ڼ4mIѕ.gfHfrd,jGbhp8%bbi%0 _v>E~O]nM¾b\¶HL-6\L~ͩ\e3ːFY~?]h_&[,Nuڰ`,90P~Ԅte?V-Lb'7΃ )2ăш \X'38lz7%X+Dr;[==HTNI4j^%YtbG_5O$TlV^"K_l': vg|=}7vuF׽2|aeӗg#ɲ7oΒoNZ-=32o.WrI|7>Phx[S(4^NiRq*e<_V;s ̌sɲ.2S)C'bZCһj.NRJ63wdGڶ xi xtRܒ q8v87.{k~-w:#;ghg}x X19|u9+ !o7h7$)'cMO;+ZN$Fv) @VټD6KawQI0ۋ>1ԯp~qEp;Z e_٢`y3ގS~96XaaV[f\ Jߗ_0䨮|1DYRiXp )YV:,DA|ո f,9R.,,mUV)H~ OߪEJoo|5y>S`B<U*[X1c@^ˈiDk45[!-WEt~]󷅺 GazO> IKB9UaaRuAk؋&_lѰ8E;?́{*@Zָnϻs%RqH1).}Ԗx93^{-!s!`mJƙU?|Ymn'Yud~TPE#&h<)$N0+D{-JGt`ZDbB,Yd-8,r!c0)fe ɇ rt-0!|aܗm\,2" jbKFbb$S, A9p0IXx_}.|vܗ-p&K~-.p;X9C6' xE介{.՟G2` 䭾 ?im6J5Wv6L[ nM3 E)D"ዤ4@Z`*Veo.Ʈo[+_UҸEɏl_nThcv<Բ=wJDq]*u/@v<jbs\4/:I]f}c}Iw4ҥIl$b%aȚ P_Fh>(R+Dh^̉bb(Pʍ_FέRȰ, Ƹ#kZ 'RȒN-vϻ dnȜ3AI՟ڂNNcċ&k%/ˇ@h0onxs3$xIx _VR3eO z߆f<_0/LX"JEh{@&zGh;ih<@a }j?- `.a=FrKX#R3&6x&4N6ۖ%,"J,E“XM[NqXt l+-6qlU} ц2>gUv+?N8diUCX?-4pXީ{is mOa%ʌr+e~UſkXDz.}UޕxW|&¸h N%GFxgjIDs +~Cqݟ|G%/, o~9S2jo6?7L,CF3)Jb?3*6p8Eμs|w0_v:QAZ=u=t:q%VOkkͣ}qBoV׶iˢ vLc*V)s Mǣ~L:"o;߁_,ijjY &hQR)E 0k.]tPHrB@TwPVwbE4f)]W)! `Df^KyS:3o - @o!>e%2|bmXohG8[gkV :'`Du|g0t%LpyW&lkwYd+qI߄a&1o0KbUN y$Q1fa. 
p[(,"HX5]Wt5pj]^M0>vr8wOMK ū;tp&P0v%^R ^SJx->c;X{҈U߅/)ˋ[a*ޠ7t1hmy7x`j%m(C^x )Kj賙>E'U],:J$9`T^v{z0 /W~_f!rF"U X0YJ $>}6^Lrzݾa|u1`}a"{&C,=*SDcK)K0,,xGI4 GD%.vf<@m {w؛Zѯ)ͩ&+?gY< i<7/ܯYMet+ fȂk]G)e3aWfep-!e(v!fB@2Jf*ӂK`a~B-=Q6kB(kؘBkɑfD C\ݚS?͏Eee=iHUQSebj©b3 -P뻟 tCÖDu].yh%%کqJJh Ju:'R>?P~rDI;&]a9KKp ":;@"Ӹ> \矴%bG_5r xm1I%oë[;"XG"`"RSFDH@w` XRRKUuK bw&]kɧ߳7mwuG׽iOH+&Iݞ7=sd9Y&Yɏ0WݞdA7kRwavJ[[Zbs7 ݚSԼ4/OΜ33cf|>\3gf'fPG֐ګ TAR=}0Hv!O7w$ef)wNRn%*m./nHӽ8$&ikCm-cjSGY P#QFԨ~}sl0bʹ%Ѹwװj4<ܠlo|oaV{vsKgƼM)4eRT]$7^ԪU{NPiNYAtN2:}4/\nV<ՠI>q8v87I{k~-3~Cv?`n#8f9_ \B1\*ޠQ  94%6*zyJ-'# E ;O `cKGrV/E= oӵ >!s[QrEE-tdͲOnQeڍސS&z$6lj65,lST!k:+;/eKJ%b)tkeY!d2ۗ\[W$0(xqJ6/(s]T[}u -|?i^+1QfQlf AHT:o;cƌL{-#chnDtf3o\}r{ztJRj0P5jxT>s>`.0 $- T)JY*Uc\5t|_YN_1@D* k[֐s+RtJF&Gbl S\-&x#32x5sf"0|'lZP=gq۔ٻ6$ .~Yۇ5v ~gfLqcW5CR#gȡ8-v9 nk,fz F P6<5(e1 y+(+jfSbJ͙*O$tlO~}D+,JGIl$fJh 4,4#b3!|i.S(b5#ZcPBfRpjs)O`9.8#2)yfY&44 +ͯ4Ҝ^aX[.V"KjU9]iy/O%#GRZV=([ޑ]8FB䉐6~R:8y&fTH c0gy:3ȿ Rt`[״.peJhi X2I#2D<"4MEs*ES#,JrkyYp=\M*O9yOzhl&3to߇7X[tMDeKŰk5&^ӄs?ﱯ+GuL3YRLQ}Q\TNy1wG.EWy>L(weİ'UĿT6?[m=߫A^ ˓]S,_/ʽ/?_|SɈ,'jm WsxAˠp,Wuʍvowxz4ߊ jYLe fفc\ǢFf a34B#U:`Q3_?K9xh >DX}lO8[;~;ٹy/*0 '4Ϸ'dfgOrNOOfmrhYul7wZ~˖kO/Ӣ3^n&0o,E[^԰}`VUEs7~3-^MSv'C;-@7\o[ 9 .tF`\W 2DkjYҁc%I9CY(Ci„שMAZG4qqc(MSchL JbkSELD$Q)Tbno7#8VWK~;Lm5W*BHJOѼ>N(0MC1ˏh@H@apЀHQBX%$k@h@+9]p))\F Q\X\P\ !FSkIrgZis=ۂpۧ ]} 5ixnNLj((nVn}]xCq(XQNX\4^By Q(kRp+ŧ`X|{~4޲分LArMC$ F)ǿM-]R&,ڈFɬL"[- hr/ `eɽp s%=gnN&WF멿G8̢q_WD1'3=DKP#{N[L9Еһ̾\_̓`i8o rp(ġr `R!D"t2S}ti[ MI9XZZ-E"e&" t{HeV{=޼&7SFHLwۻj,E %R|MC|5g2*ꊉhMJ[QM1-趀3pCz [?b T؋A4:Љ{(v% #;ġ2 "\!]PD{Dz=A]`hp3th95}+Dɵ+"̝C`++thQֳ{:VH&+;• A&YJ/#&/&;jx-شYڜEl*E&mo5i r& {'q &԰Zv3~5_vMrޱ~mHZ&q6LjZ g_Jp&H.IõWkwf#sҋ$_AE1,σ"vstp/ p*WQɯUp S-oѭs_oOBO7y]|n]i=ǛZS Dw2& EX^>$*cykW InqSieuОKlKqSžjOPt-qLR;r՛g` oEJ,]2\$藊bGZ"W^+]A΂W@l.څJJv5e_YpyKϸ5JP&t]p5~g?C1_o&c5%cRҰ MJnrG:1{ &Mt$0: 6RY:_[rBU@9~lnkI:[9ێA2fqX-j}еvwKdI:' 5na16vƪ] nT PyQ3uvp͑.[H YڡT=W- u)żJr9s*i*Pđ0Kd:2[Ofrx>Xq ˭Oǰʦe˓1+e( e`ZVXZj3u"Ě=}AL.oh5.-$Vb i=yeNj:[J]Rk2+Eèl˱gk`nA:~BG 
}.,O7#к"_eU}M.?u{լaLZᭆoZ\&X{lCVv3w#[,a1+7ӎXYcɵl+F?yvm[w}U6*us(}kpq EB E =I3 5+luSCku ItutŅT:DWظ]\C+th9;]!J$Jiʯw+K)=E4.7h P`*_#g}9ӯncw{>ts};c3y:&j z\'Bٷq< f 6@zXNDlKdxN(6 #gf5~f&j:NCBffQʄ8u)1Θ WLlD{DVyMl)x:wHW ]!Cbٟz:ERK]`ũ3tp3><PtJ+KttDԞN\fU8CWW:|huADٷ M+k+@k;]!ʚӕtN>#hv|c`CMW2~\jVY;gt Ҟvzj!Tqg w -#+Dɔ+fQ!Kg Jg+DU Q-<]]q#xm9͒baT"yY|Ξ-mpދ7ٰx ?OcxiaJQdSC=R;;%!pɕ.`I9nA%t(k=]]1AK~ j ]!ZNWrt=ȸ CdpFBˆhOW'HWF(Cte!i++th-;]JI1xte5%c[bf]YIlégD2no ||\}Vh%9v%Y=ۂ]JisBǺ ߃bvm[Ѯ5S wLOW'HW\YCtuB+DY tuBte8&e4`ʀ* $=;wL?-FdP jH9;)'<|o߾G \ sKiICA(4 繒:a Zn?@_KXZ?Naĺz) *R~)nfGUo/`4;K7=R67z|ҫ=E+?_FO_\7]_Lwǯ+|l|$zxT*ոdj8~-4`FW3p߭PqAZ٦){ܖA?0,y]6W߯oxVE,LOd?{WFF~]`}v2/ &X`1,Kddze$+&,oũ!\<-Vr)JG"`"RSFDL2H0<)c")ys i$S㫖ҥ#4kCU_f~VT\&m[\,M߯ x.,x Evj.+r@e¨4fGxwizʓN'37ThK^nK,pv~L<'3=VtX6/]Lկtrug{hXu;Un6) hu+@K%IlƹfT[d\ r{Q[nwp’E*"ا]LHJ8  X,-,xGI4 GSa.1P(QK^as!,PEWb<t8x^UkŖ]lL]6x .3ݶR"Qd+ՇZ-~ T}PGfXyAO?mb_{:ek4۳g:>7 "pR)MG'(t:(1#-=p/KB9-sʕ"ZY"޿]N`0o[I'Pv02k4ْ$gzxI2{coRBvu@[x^m JY{&N􂞹UgR |f AHRVa S띱Vc&豉hj4BZ":%:Gr$gg/?UX,@^vGEwo> ̂HKB9UaaR"庐V3ֻ{oq]lM_v3*Ѵ\QOY()PF[/fKP2zE=1Nq飶ěU`̙N`ء]{kޢn:eڬJ,gr?l/.tXw~!7j8((uU42habD%$@tY$) Ӂi]IՅX={=[oa iU3``L1h&$Z!<0G1] lNcƴQzHL@:łKs@g H\:% ̏H#!n. i8/UNE̮>j\c)_߼ď"%~?*%1;ί"^)׹F [D0FNj?z>qu@-p?r;p[>tkQRrNt;գ^%P1T)bփGjd~sVk/;^sXeoMw4|1_TѠR4,]w~KcC6NSASȗLԕ)2veN;9ϳhdWch|5ZDǦU9{M.x]y' jPtsZG@ f7ҞlfaW`vP 4OXz?g.,$iu$!]SI7̇㓓+׎q+!~aP}06+ҷs1lzё͍ v;,dp1mmɷ>ư/B~>>?9fIq,qNL:@3-8Q,ݒ ,Zc{1LNK 5+xileɿ{dn@fOzLg#WO  PjZDz_ՅItWneR1-ّB%Lיn y1vgakwCFⶾ}gaa&Jηmg`|=Z<[:)<}&cyy}Kjs|I!ۨQj{F.&bmfTAutya޺RTby-5渮,ֲ-UZ2֩]}ݕ}+zʢ~IsuA'DGyޡ@{-br*ד]G. 
9ħ Y rLޔ߾TY޾ |0Y7G?~ pp)t3{5WO_kx6qk%_6eQhY,YҖ4g]e뛣*wu,<(cA() $)*Hꬫy+:y-5B$=h^GOգoKU)ʃM jRAQQ# {ǬrgSt(YyK45 9<@l2oj jsDo+}C3 S_zէ[Ƣc\'Ltҵ?y,![nu O4+|D =`Wz {LLryå O=cF@tE NVF.VjoIZ}?H)=_I.GWݶx ;G"(-N$q3Mcp1Q~q[8~W}yd/<]uqo,*L IQtm/ڴ*&4.00^>S4\Lto{XVrUc ?jQF;vm{E.rnx&uwlX-{^*.|Gބhੇ:Dy&W6/_Pp {G|#N ̀>{k:ѩLrJ͙"<ĘaZY^ZHߤ/ "vk\6N^ƶ'BSΉ>9 >"}#m۪kBQV!$S"psn1g}yԓ<NI;3&S- Ӗ胛 1(*䃳 (jQK pR/Ɗ2%4x95qܓ,!iJJpp^tURBJAY&D4rfb_δ'nUC7p%s`wL$zʛi;/>}zkj`S[*ytVK%M}2avȖrsA( jmC.!KF(pGM^FMq,8-Dj"QGngr!ugZĨ#elD 1F GiJ[&i% G0D wk! l1%B# bijhQ\1 A*!lh1:kΞ8+q7f3AMG<vLsPdZ@7 J^F!A-DGZP="CO?3h? $;0*s8J2g*k5$@9$\O:Zn:O:ئ8VJefJ`T+rPȨ572JDDc-y8:Qm7FmlZl{XYC6,(߲ƶ{Q x3miZ%%BN @8ay^(7pB?#m6 y:Tg=)י|c|X^yb!㥗 &H_Ā6hoҎrNV(d5n #_"{;_p8!Jr1wRAxprB$E!aEf1[,\.0!S&J;:(K)x,>=`oGg]箚ڹ?`j'-òRʻ2 +B e0|voH,D%ʗdn8+1tbLǓFu@QN~'ɓ}?2)e*hk70kL%@}ĞHdh1?#כ2M{-#7T:* @pPBIBsLqS2z u ;MMv5+S=''߲q_GϸFhKVÙ j!|o.mJ3"'a3mC $sqg)SCqa\Q)_onN4)fe>htS(b_5g24g: 8F"rpNSw`"A1%x~:Ӛ ̙a YiC(XKd9l+k萁;.wN45ce;|fQʒ9< م)IKгoJMttt8=8\~cAS?p{*o9DR6 V *%/;չ*{vig9]sLx7/ -|pprZ-Ջ](Q8{ߟѼsa!VG51jHk5t - gqS/&K> .u-/G o6/VkۻRJEڲ9b&՞5=VBĿ*9sS??wˇw?|>Dǟ?'ipip[zM^ `Ub{T-Po]w>._ Q ]:7%(8n.]tݬRfҚjW}[yQ.|Q c6 d>ꔼ ̌7*DXQ]>=ՓmHo˫4wfyD%]T* +/2YJ]b&24X ;S_]ޞL/>,[۶mXEmpUL?{;fLl~*z\d2qnr:|.3l)ؚa9ӇA]hPl~3+C2YR!p(oQaS1PEWx2/s 6=.lv{sf~}yEL+ϴr8ɾkಯf~}5]􍶿!NX|4UDjȉ8avRɧ6/bBy>t^hId*^PE 9q)rSH+Bdx ir2 267iB?q+5m^uƞ:?}gWq b)2h.TץIn GtB XNOTRN:)iDtU ]r,tUjstUQZsHWYnň  WUE{h-t(: +zH0GCW˅ ]Uǯ*J{/81!KV\8W0oo}|7/=$?ᅴ^|2~7ٵS񲎙.py?{?TkPaj]_$8 þښvکچg'ven8߻j.~y1z2^dM4*2jEJ]M_-es:(UԼI?7-→ xyzJҚC7 nErW?bd^NjChd!U^g=e}q帮ؕRQ5eY>|҅su};w|SNC F;s}Kg3Yr`)eh'ӐOȗRxqoHĵ=Km/9&jHĨanY[Vk>^V\jI_:_-ӲS-\ $7Z>}vdZ vjO]_Ϛ ^vF л. 
<]0h&5vW>7~Z3wp$|tɰ[oz/MPfۓ.^205^Ŷ6 G=_m:־wvZl8fӋ#,5eK_I#O~ɏv':SO~74q_dMg?,jI"8XPuKK-|<~oaH[%95SPjμwu[s"Xypȃ6qm}ؙ_oK~%E/330k^Tf`ns23hg :fAf5?dx{|@ m=ZgābO޾gn=]0}_Pڜ0&փik2ϻqpϺh|= DM0p}k#|x}kںջ]+\7bceRh,wHKjreǂnZEkz ۲?\UO-W\12`uCXC6/:Xǿ쿐0kGK~+C-XE;oF zLSֳjk3YC|X|%z=\o+Kla,lnYz=ILdW[-TިsyQ5~KomvaصNvf7s^0o[s?hr[_4lyh1d$tew+aX -[XYJf {xm/H_0kX18{nrs#JCX64•n,ihEkܱ%SP8zDtUX誢=骢$sHWƱ5UɭJj=1zlBfyxuٖôVvAa6~}4FUX'UOW;zr kqrh_ܮV{ cC))I)ȡ^ҳj-Lz =۹bɽT<էN^yoI5AV#J+:SSSl%fDtU; ]`Yh9v(iF%ҕH܈ ]U֌[Q]D1GDW]UN3:v(ʼn^"]9!i{K;pi4 *Ɏ*J}:T銄QfLtڌ hFCW@kUE䉮^$]9bbTt3dW1CWĎDW/6vFjJ[ȼzi}fQvɯ큷OʔPZ|{KB*N:n#UuZ]uT{*oޢqSή|Zڴ#R(avqW=/f:.zIA97/vFˍwH7Kbޫܷ&mC^x-F트'Lⲧ;^ƖGeN<"Eka܏{c?6MeFG'nF:G]{8͵f{ϯMAr m^^(W\R J5O"pHBA.yn"RGEo>~$C/o4Fq>f~Jz\\q Wy#1$Rٕ`[Yk)uҨ`i&O9kݵ] DAq%K, R2 Y=uo}I$Ex0qU,K}}]Ԝm7~*Ĕͅ%8# NZ8U^kSDAD'ZE3 8K2 THd %q_'Ic0haZ'z#kL_ARuR)Z4h9*3 s-V\X!˜kIO/킐`F7! 7Cc 62DQJ Pt"ˎJ 3c5'@άKMBh*-}*XA /˥s%/'@auHI<1o2sE--p /`I ^hğ QY Qe9eDbYF('6++ƀ Y'7$FL&;,wtc## }[ /=:DRK-,!2"X !$Z$T-Rgd0!YD,NOp9h\2FK;aC%Xa&3IHgxED3\%3` :2H`PrƋ@ 3G ^lʈ.)M``"^FH+]`VrH4Gg$(C m>0h-S!MGܭ8U^4T|U`R5%^@2hɃ([I21t@l%lPL0VS1d( )De]`dV!T:2mYhR VC *ZDx@ti w9AA)osB1v! NǢd2]<+0Ϛ+")ޢ6iBFKl`ÓӐ`AV#82(Cό{^d[ P$$_qȱtřG0˥trXE/3%(!.c@BlTzCr C+q2F0u[@H `%e3u R*QJG""AvŘĀeKRf*+#L Qv)Y"U rV$n_ JNg1(J892b\'Xd5pbMRl iK?U Rٕ 2I]>h $hzCn@NWH"F=Fղ68MqHB]koɕ+vi"H0&e W[Ȥ<\h[Mb7OW:uOud9Ppv؈,}*Ҭd :ҕjJ2" ,ÔTH09 ~;X!^BBgąs՜J0H1A>xmBV5 tq;XFi/! 
O՗UBd: Յ*oAcdpm8 U"j1+ѠKP-"VhFy¶AMR tY+A";H?vM~0N&!TehymBރ4"( c ѣ0!>h߃^uD!*Kt]/s&#Tuo{Hf*D{"<{%lFr[O κ$!(w :DkP(nk(i 5g Ee{}WA?МH&eGصB$d(&J+1!e~"Q y㠈Bõ=`yh횠͐I(g 5?6^ӌ' 4N #`GʡmM1WCyy܈衭BD{NQfuH $ hT) SRqX շQİ%6vł*OW'Mm2X{iB V!gпayDyE $LO("YT1"@jQ$Fb1wv>-P'GJȪTC/B- S{rc=7 =S'kؤ}F&_T4/QH&ciP Z،P-E~]뜼^NK }UP=xZUw֨UY[(m;_` jNaв¦E΀Ozʤ蹐XL~-#fC 7u=~)lj2jvvz*?g Ez3*^\>ѕهz5Zԛ>tuJgҕVW`'6 gyzBWpt壓F3]!]M]uÙ"a0oIytE(dfJa?Nzn'npnhzmt͗]J3]=Rvf0tEp ]P*tutR;1wۉZ}4 RE!Ar8t_c[= WB0]#]%+6" C+B\/ LWGHWVYDWB}T;u]tutT(7"f0b>]`v =y2V$箎0'WXj??L&W}׿s?'NCG\kg{vv|t"@Fy|g2]NuM⧷Nң4ڟ#KD>gRMB7JW[J\ ѶBuzcZ,fSq-oo0 ޢ3{&]_4oTq2L:@L}9j>ѯ;WJr~s/kt9jigO/5&dgr6] *l~~/K-_@lo?Lk6XYD EdѦ-^Q`GNY= T[\]篋;󆈜^M'/->__^~wsCZ_k~yzv}Swws=O^l|t=^7rgGZnS@%F7쿭nݕqrlPXo7-N1Yj{w:)W;s":tgIYNe]3:`^ːkK>j^T2 n2T <wk8Ԏ~K7kzYgLdl,g?'ʬM]6wt}#kE~4J|4mHpuݻ$;Cۺf^_̻GͶ~KJuii~9;^[umt-emk{[jsV}Q%H9Og /mz:O4IWBuO L~_.# q(z:xg0Ky=ȭp/(kQJciBb}(*-IO /"/IfܣGM'y?;o 6җ#d6j}Or'R/E!?/Ō :Gr5žVD}I*l<ܛ-da}O${2dV}7ww%}P/EmkE::ArlK[*Jgb&v>>}Xa3Oe,՞LLJԖI|}lCAE7A^%j+HZ̝hH6S'>VZT6 H X$bPx%0FҢ} 1CIED7 qpG2+~G$X+ڤd0>+boTTyv#BZܴHQ﹟ E7:ys_K_dhhA1P Bkc)Zl {{Zgw_!ޗ$@Z0y#@A>%B)s(")ixmcupZ=5=uʁ8`!M6EDQoGf~Ds!qϸw@sI,2=@֠T>1ŒtJi6SjdF+{(Jk9(Z&l) 5Jъjq{ݾ;Oo1C#8Vt洡^<ɟՁx3L;w iIO4DaQkFc`4T:Y:. 
X""-FDaoOh9GUHG&· %ޅt.,<A\aMihXi kF bG{1cOE:Z㐷JmY:%AlX]nD0Cd I?jn;':vicB[ $r:4!ϥ 2ͨٗqxkK!Y()O!tDШ';WCZo!;XlA9<mP1)m LJ[IE ʢ''.x3Pp 80`8sx:jx,!SF$?IYea.`J})" <%gsW9ůg 0aznd0e*޻U5%x{MMk0gg%olF:ۿ(R_g 1tlvϝ[+*~?m2_ztp="n]\o5 n9+Bxh#؊[0!qJ_; 4<Z5S}Ol1uߺ܎ז ~g?lq{=ǸN7މwo m [@1Qy%{Ӆ^ٽ<wIZyWuMwmzVƘz*]S7> }:9`}MHg0>۾I9Y7xn|?~O_~BOO?(z0 WM$̭[I׻YWO?okAۺ5ozkͧ,6m>r}W~صC3Z cv~~C?[|<ݥڪ^&x N b~?>'ܭ:~2EMߖXjC 4ŅgU ifyX#- ?f8)u8o4+~m ic쏉; Y[u,`5^z%3v i1P s+ɔ42+aQ%XHũ;ttdO4xJĴ>d2IڻԢ4Qh &I$-[Qg9 j,P+ӡ o V:tfb櫐ցW{}qBRTPHDB"nC:pNz)W} s*2%ygcAض$P1,NB8$NdIr@{ZDS@,ѝ|6NZegX˯OPG \TaQmwF0.۟XNgF vَ6w?o U!eF;^PZJ4E`χ @vVO 8 ɵV$-Cϭӭt`1HhWRGB܌R=ӈs, 9"*Fhwwne޵$"q2 񘻠 6KsQq ?ygLX^rX39w#.DXPq*FjXz-$tUp}KLu'u>[>k=us]Akr蹜(0sMV千m/, 7 q|X0 pݮA LzZ`i&*0A"\!E0`O0<" ru}d*Bq`XeEZ6N6ɤ n300DdmqY'^OPO;%-/nNהY@RVIs5ax zL T5M Z,~v֯6̹8fO:eai"bmdI.rzMFmf1 Nf +՘MtUOc>OnXO3 u& tR[|F8zyV+?m)pBl>NXȖ+5(Ek18N :gP@DX &g F8kWwnu{+ %9VT~MG4?h'>r,(LkN)oXqW-7>%WĦ/G`Q[6Ik[9Ɣzz߳bĠT)ŠV)H!1BAZƠSV dmoE?*ҵC,uk64y򓦪<^o=|LāŒTm9GzI%wEMP )&Qn"72W {=P$iA8 b0j,*4R%ZBD@tTֶ!a9EƊ} 3`,g$ Ӕ52JX\1yކ5GI+OI;k ۃPl8@8>Y^YDWOzywJ.[lЯ%1:F_Ej#_k0Hǔ#gDDJiߍ" =4ή͡?h@'rXcpԙV "QG) L P{ÎX9 A(czfl 8 TR$NdIr.<Ş8bHt Q%f9WlFzт^`F7/Mx,YǩoXmM5 fA=3J򼝸ucuFA!5kQj%xGZN*ہy7j>Nݨ:0:0ҁͯ^(ZSS8-4%M H y\Лp'oZiE<牀3(Q{ԩOR+[ҧηast6J3H08skHd"tJDĂ)1ZJwv]mamz:=)LJmg{߇{}ʰ|-\oy3bs^U60,|-,|qW#1=LI92+@Exi966XcX|.D\9voбvO:lK=gH SO,SBZ=&2uc읮 S{˼Oh<ە3o\WU7/0ynz~k^\k~[Vqs2\y={sO}M9}/_]u^%Yj_G|#fo["07}Xg߂9ɕI1HdlH(o)xyIX 4 i%Y˴&cu1$1wu`s "N!ΆH}]vĜߡXal Z ذ3Jegʖ5gRO=-SMIeT6*8c)>ue\UV|mR뢮.P]19&g2ܜJ|.*S٩L)"ՕTʼHw*nm{4x!_6mX| l@I2HLoljj ؒZbY#R,9so4Lõ~LhBT1`5_ N-_xci:v\oQLhkL>9R&WPНyӛ|_\ЫµU zh}~V;a;٧MXd*C ` 8,; 2P7M c \64vwQf]'d%lT)Brg! 
B+<5mzF B%.*!vtb-ԦM+$iъ`WUIiWZ`J)kZQ֪5pUĵ-pEZkfl%Á+m+Xs* *Ҫo-RD\\UXUë(;:@PPi\ZJ5h \$莽}Ep%zy Z!K/hU/30iMp )i/+cmj\qC|PWEZHJXWW`"o* *ҾRä4+aF"*["q o \iqUR$+?-"R"i \}"֮CmZ*aWEZHuWWQhզ+͌Yj/"iWEJѭ]"\ ]ݞ+_älHoY\=\cTN꣜D1fo33gG#b)*f* V^pe + Tr>5Rz&J1hL6b՜A5_Ψ d}@>z~tnP1:+{\fPy1ќ4]%+֑jޅПWWxr eeHTUD 9CsLfmB𵪻*[V 8g:_RF+lf+w9=]R-9g[h&ӛ|)X.7Y,'kwx!Rq6{b~0995 J8k—ւYQyT)כ7T~bұc* '9auoY',E}wJ]طzl 涴!tZHNL|l~_^hBDoZε)-8ҿyŹ ]CrV{9[ř^4yo_1I $ڧeo2WI5uɯMkwifqo]xʤт@tL5KQJb rLis"` V)S3ync[ݎUتmiNʑ)1D!Yɡ8x ;P2 Xv4 Pq2 jŌ`$-*ȥn&%M' Yclg_ ܬ6]J?/m>wmX,>h"et5梎%A 'F2  !:h=d }, {"L }FtIU_R/Y237Yَ^:'}]1C&%?d催1LT."E3͗ wUYr&&UC2)C Cnf?<8uf [b}DFj/d&cXbq d %CV#,ι,yV>i@jj&%@MC H"io}ԖhYĦsSl澴}Z >:=0,7.i}{:Eȭ4ˆ(g.,k [LGv%WlިrBY>0yHH$4Ma6eFSH9BRB$=k"íO+Nǣ`m?̂LܣI{e)#)E˅)JD4lUrF Z0g3 {ە"2ϗagnT E6$ƖPo|S3qGV9bLQQ2'm;:'@nlݭ.:M}BkgAGJÒq(~^Y>G7=,z "릊lǽп8#s/?}8w?~_?Ǔw8`NɇѬ746NJ]tE?ygՇh[4- nѴbMzm5MvyCk\sk@Bw]Fbw&'t=kRT/0+$6淨mףc !8u_iy5Ƹ3F&dh8BQ=det0\l .wwR'6ƾq2"`A[fmnrVNW;I;S # !"Eg Ϙ:h-وuRrt&'JvAs,yYy:V" 0G )P 3璦$@)*ˡ "MF6*1x%ďيj}ӭN$5KZe\lYz1-| А`(HZ[UB|均e߫<5i_)`E<8綝<&y|g9G"X@Sn,KN[Y!1j`{n1! Ldxyx) n_f`_J0z^IzҰ7Mz1Qw woϣ~66?GC}mC{EvѾQowj޽P+T֝9eةmPtk9eyzNwJ_/PIy:Mҥ9S!2Yy*PI ۅsһ&^zDO@ c8 }4.qdzI#_[3QĀ!\ѻޚ[OB\^ZѾ -*rO %[SEݚ"${_ޖΪ+rE-HÚ,h|u\#}p&X.@T2%+k UTFRW+*}\y=G_.فܘ`3S 6p܃Yѡ>0:e%ȍ:+'],|"JY8 SكN+ô+QH4hWR.^OJTl:s:$ uaYF. O^tYݎTԔ_0Xܾl7vԞ&{ .P^bLBxi$F;=2j_Vr2]bi[P 2 vk_? (y`H{_ț|#hzhgYFNoz3>|=9ͬb1Jibb 悷y$`T",bY#Tb6ݿ]ϑl+tD&H߂BV 5'҉ڗ^1V}ͦOmi8dn"y]d)wF 3͂{/UAIR9eBbZƠSV dmȡ7? 
r f5['O<5 y&os{S=#~JVŏøi d1C7%Ž6X  qL9r6J KƋ]/c˺ږ$ J {y/([gZ7V _Я*U93Y5+?b"rP+4 ؾ}?^_l) .g>>}h/ 0[`G_[:jg[gve3XYxr<ҊHO :f\_/ Z‹/X"05f ԊmMڥ|g͗Pi0}^j ls:dpsduS|s' -s0AoÝ-`ӊy#'=6 gP";SQWPG[n.tGc%WW)5#5pn ̙CLDO,hn0K6`^;^;پjCK?rݟ>/>,3G+ F<9'Xm[fMcFՁSRPz*P`Q"EsͩM*&5 j3jQRjЩuv|҉Sw,rm:нlw;|eOY^)Q=FWw7_~߹nW3 .тb10L̷\G{^8+&1qdTSgti.1t䫽6qݷ@3^u?]P kjo6ąHv ;0ͳ#[w^-P_FgNreF15l >MVAV SR.$Y˴&c6DL,IE)1AbX'-DHӮ9!+8)GٝN,  ¶*B!Ř T&3dPj2CMf 'gfy6xazd0 QuC`u^  Ɇ#7xIdzA):&؟|W-D*6TښG$8h61A J4R(Y4[Qpi)C{\TZutY) .̓#JJu`ATr*z+J$84hrZ$ FJRۿ$ vZRI'<%XM0Tba2 4:ɤ]xv3Uvj]y-xmcߏkݟl5m[LRT$9 =AO9)7V28rsNR-N{M4m fO!?w9+Z슥A͉WT`;yB_JZCνQ |;eDqwjogފ4[K!zj9 Zr FҀ-$O.&yg8u=G֏hr{Ek!t賲4} n_kG^?ddRN Rr=`'#څh 0Ne"H4Z"KcKZ_{_`A>n7~:-mϲހv-GI^J7}8+뿻V,񲊛\oM '؀Qar?#vW_Q.N}qeWdn@ff逹fYW82(^ChyU0ҫɢtע$RKC{ji/-2A] Ή˭+{_7/ v9o٘ہЉ#WG.y']tAlxI]1E͞ ګd%p'1a?v3XfN&],Giϲ)L;eU%aQB;{(y$vA[(WE5p-8)<|Ax8;SF*uAjͩ3ʥD+uMU+VSdd 8{`N/T(Rko3UBO EզUg]BjR)BCkfЉ2;2}PÆGvܣuYj9i`[ !$V3=֮KdK >: kޘ jַ ԋ~|[]k[(W9X~D حj˞ȖU+sn6 (;jCi<#0`F!F$юcOՀe ǾsF6:ASD,5,Q/hF. fβv}?khf]D ;dOjPC"9V,`zMG4h'>r$(LkN)/AǽbJMb=׷uGl{v3=mc1N,KjB}ĴPUHP1^ q4tJܪ$"M"9T`C6㗛lriFM ;M Y^y%+̚ 3(/0N *bIȍL>&.l"%!N A#aJ8h͞E` {I#^B{P^ys**ƮcF⠠F1I*‚4e QFI{Pg)W*=uX"""Hh;H癱-iU]w 6`Cpc4e'h vgG0#``Yc!nDƆrMSk tTHHϟ埫[w[Nv~yYVM4iZp}2 -;?ʗ:Ei8O nÝ-uy>AE9 JTd:uպJzMXpyKеl+I=>XX^b"9x锈 &R0KZetwjV^L:B^v ~Q=sQm[VU;ǰ~1gaKi?W̹ͫg)WD/`r,Ҝs$=(ڤbm8[>:w&%I*4] i̧(H1u'\".mޒ$l#F|/ )ۻ,h>`";vymj%m,f^ɖww&~+eg;&LR{̖:\vCOGas}{-?U־ 9&%cpgpejg~A ;}SGc qrQvcM?rdLyN.(ē\ʄ 1a=36p*)c'2$9bO1Zks ~ D O xǗmzߜ}/Z\1-xBuhUCUO4j&* 06WW" !Kldhښ"qI{SD)JڒEldfQ{ILfRg?zl_"mjedK}fwԽ>-`U$<"ɝF\D\b ygJ:зUy `)rլV`Jծ<}I ˃el]sq ߧuh`1P)YH:0ņifu sf 3/3`t\899X*MM&̉LF7a@, g}gxVtB2$HQ~cp|g|@cў߹~;c}4c*d8!;N$5:A<@P&P .!LQ=!q=ko#7E/6Yd1@>d'M` >madɱq߯zX%Yږd3fXot@%H6ƢQôA)DPI2Id}22f4$i9Ni% {gFJ$CWXQsƒFl u}5`1k4yc0wMK7khk -?_Մ:8D%xiҞY$a鉦 Fa2{f2!$x=Ce[h>-_$Jɢ=Q$~BȨ D&0ZS1S+zI3/MCX6t&QXM4>֢sI^QY F0nvkϨV|XfV]2\-?迼o};[ۨi/)YZ*,YTDǂt4#BCyQDe"*zrșKmDvsj9mvG\QU2CĴAltX֐ecfUkQ;r47âX/ECzȬ֬ Ť8vjgǥkSPjC$Ld}%HPjCxWc1^Ey5 s4;52yqӜВ?Om(_.ǽ3Jf/*MdRy53-f=$uխR?ֵDM 
kPg1em,y8ǠS)G4{(:b@i8BjB IF:b"wU᣶GAz7RlINOIfb6e$t2+1X9#$%p&Xy꧱[7/zŬ{mCYbD^d)Xx?o^b[ap;("Rz3opr{dlL$qL릁ӈ 'Ĉ1" ,M>nٽ6,NY\7뺵 !O421}pʫo$ro/ɱoz"}nWDg?˿?Ͽt8{I00Jw_6 ?w@{SO4tjaS+'f^̻}̝Ʈn֚Cw4k>.gwZn1N~&6 u0yE ߗRLB1dC<Xr.ychMgB1A/-p(15Fۼ>}d,km%7o:dd$dA5d3~W9T3!+L`R rQGg{uEM2l޽'UmUz'J>3ANO;[؅XH)FWۺiT}NW׋¤֕[=4"lJ2O1tM^9.+9G1&}4ĸY%6*fAlT蹎 7юȺㅃ].ݙ]9Dn/Zv)Cu\[1qE!S}{,~\ 3꺭Ocm5k7(w tw;ށ*nx7_߃ןˋ_.sc^Zm7"[kVn~" #*Uh DpQKև^ @iy[ B)VG$ɥW\E\h*P V\qURw9"qUhUw.AK%@+ޠ G$ F\pT &h%CWJZqŕ=W nJ|߿oMК/h:ؤ-mLd' 7r2iHc:R2-& ^,e;t_Z'c맊x8j h ^UTI\e8/JsVmКKM6 #$2hUx2<;w9*N,傫B+}ZЛ rU.*u∜I6q)gEk}|˜b dEZ~ V+CxINmPȓQr2:Yol (TDBЖVwC!?; {at3ɋ 9 .e"F7X\!&Odʄk# ? cKf2\NW:oCeiV 5ѧeL3% eBD sduEH{v3(kEJ:.̎(4x(yPG_QNٓ#'k5d̾ԦqO.$:NGa6..8pVy"NGH@IWKG鑑-툥Mn'FF<LhB\B(s!X{@&CH <[ZBNOȪNHW)SI7U3DڲͪWVc&묻$MO@P%@yV!{+Pdh:w 2 0xzMVYKRu¸&K&'LrmZj2@s2c@&w8_%x=8653"||NpDɶS }&j|!|J}e蜕!Ѭ`bd΃N ZxeFwx̾jd??[֭0x+v!}ۦ'ߔB7T&?M0gf'yH[] NՅ^Xֵw"mf>,_\.#s=3,,$oЫC/*AY,1,B,7$gD&2qd& گj$fe2Ѷ|z];v5\qS4w:^;]1vW}볻xi]D4 g+|0n޹iL`\%EJ9'*#Z؜*$N1^ q,tJ`U Jn)`2|~-zG/W9}c_YN'~:O~@eA2d@- pwqKpb&hF&JwlNtZqXRYﴠ)D͞G"ED]L IDGem xWcFSBQ&IEyPRf$eTG8@q(x<hiuMN;hSoG޽|W /{5/8?~FhPyuc)E=2I8,*ʣV@o6`_g/Ԑ/d!q(3{8tmlb?I^L_h^TlIzT]zMogw~Gj~7?Wo5z6Q/{!5Q/+ϣ\fK/5zk`+ NK=J <SsyZ)B"T'rEeNEݙl%5!6JP %&M@khy:%* *7F+Bg>,lW` 'iI/&!{(ÂaխbΦm(@w,SC-HI9(Ʒ^1x`+ ٤"tK>Z'6Ea.H…hST|'wI✻ !U9M3@ȹMHOnN"-MtZȮ m!-=8a[Sͼ:r%ɍXIpM畧A{ɼyg"ne J;Ձ}@N TmeߌZK&b_`s*E (dcSF nA28%x$9prs ھ۹sxL0B*^ބAE6Us^Ʌr"7Se@R]]RQvce{P=BiVnWZ3nFS-f쓲*֑FBX4L|r4jZy!yDP!cDS^jhA|59#Ѹ!0XCrXlـ g#(C&B<0X:bk><>ZDYRKIfN淟Ś|kl9,֗JqlXǹ >T"zP$ $G!l AEЌ#&Mc7RjHLˎsqnu"ٱ)ANT")|4")!CQF.W9e5%ɢ.LB#ՐÇ02KYN:Ftkxa>+6 ?^Մ::]Gx%xi!MK93&(kIj LBɹC4 d^ rz>\E%Lp=!<* *Xe0)@{"}A-?tvRg'-weN nSJ;ԙ Q#7Z=\`*#&)9%L*R2zkG1ӫg=U/Bɪ2oFS=!kYav< Ғ&i aPG$4,CA@AsخV=ιK+F"Oe>Z6AԻ̥ W#6*lqK(-\46D0a4 ;( 0VH|'QB0w/ z5ɊAa|Yۥ!(5(.dW&F1츏;Ύjvq;ʎQ.݇qUKqcBshP:I v:V8nG=-ud;"5KŨI4bRڠ,+m%3'}.lAy7Z:\h&v#eApV;E!pA!J;J gsE :ߧ1N[V "|R y(؈lH[" #Ơ$P{[ Dy`K*? 
f*|h(Ξ )AQf B@B7ZQ#EF  ˏ5A!alna< Yw'ۄđhSJ\XPc0+Б H" 3'CߩRM3\\y.dh3@RYaxz0F%䯎2ғjJ᭭&Ć5X`{#?kq#CimUW SZskioU%Iiw֟23L.y}0޿o\wjWWB 7<&qzw쮟dGQ@F ]S%9$^dB(L57QY@J8j=!HtBye[1\I] 7\I}nyxz(yOndC =y]Zf gY{$zxzqy|4ry<LFOa;LqM6^8C'7PPzity<·WWӳمl` #7|aӠz-77s(BR~5O?֓ =XE[7慖#2bQ-p,fM>=џy9T[+#{d[m9W7<=ɇGrM_MP~^;G7jNߑcu@5*&46{yx?|_o>PO_4mmK!s^ͻT߼ka%)ֳ ߤ_][]>Fcm/tk@}WuGxzQoV=Vq?,a#_4{CIϚGDݬш!‹EdlcvU.fpWn4 y[6 ?is?k+ Vsai/a'gߍFÑOJ_Q'%"$d1Xca w(Ÿɔ42+Q%ȒQޑ|SsPM$-x$&uwK9l7 HJRAJ6lBb4JW⃖EڑAء#8Q-,&||}\P'Y9*Y4Ihp흳UԿnX ,p/;I@ZX(@7hdqq,<҂F0Hf;fsqǥI6t{y~g_*f:T~دf>IP@^B7uM0hZOӁ;x+>^f\q[cݘD=;DڽzxY:g7r(@~>|3et5rFM|5ZooO?.P1HLtP )K$)P5rL޺3.2q/XVgf|r9:E](r*^Q${Aw!Z橏\kҡѾefK]ƺ(L/oLHD迵;^`>SG9-nlKTKZsD E)CP!ǂAV R6!ȡ]{kl||yIu:ljoa7ol@e]qg pYS,.$LP SMb`"|L+]ٞ+BXRYﴠ1 S 5{)D%L{6NjĺD l"6ϼ9:"M PA85'; X@ry QFI{Tg(nϜkvH!Hhl4,jUuV{g%v%w1'!r(CYl7ȝy~I䲆Ha*E0U]X4JPԩr* !NxT\1+Tc U%B1Z$e8Qͫ:X˙/g*ZX_X-9U*[ ˃hsm[D~ P]U>:D*Bd>k^trB2NPMw&(AsUbUIPyQEN -A5B6B*uܯ`df9a ZRf/jG]u*J;D*^R2\0brK)^{2\+^1WdO_L8&RW-g0E#Bg]D2qehr}D)Ϩ4VĒ0pDXA A&xPMڅhp`2Qf$o@'ڢkPY{_u0'AmW1'hMAҿFz^pv&Z~oz):v0= s/p,qs0ϷEd ;-Κ" anJF ,OyZ8b4eExjnpQcֿ;dn@fe!W'V#1a- Wz;YZ7js^  ,$mKf)B5->;'.8&6[,8#)Y|V&T##ݲ1^B#>'jdǜcsm/_6/ԲAͨ佣b2o3cJDϪe8[n4 )vʪJK:/vi_/վffSHZ@y5ZԤ;q8x 9N F.Oje[8{tްd.H g %"Zi1E].JpL>5w6A8 mZ *-p|<6k'Z/6=:RkBz݅ڟ(ԞSرB[u7X)@,%̍%r+Vw(RJR(Ϻj]sf*ϨvUX]t,swk!-͸NZ4,pSA AcƂ 5M[Fvٔhg1QZ~[6Z+8nWevv]OdJ¨8X^y4SӦhiRiJ3S݄/+RCR U_S%&r#Tԝ:*!D.HTl% @ <XC#w&@iTkj~A!T nV<:wVGnWP'>mIGO)={}eslr =$|J3 h b 4})q}Jh~HBTqtirPyցGYAPIJ,0TJugdY̛;ßW1HL451sD D69k5IObQӬhМ4LP -8qN{y7FccdM2?V2ɖm|2tkS/,=?ݍZ;l^>ilhqۡs惢FO]".mԚWWl#g'W S6ӕj6`~S,Mj.Om{4D|楑a<mo~uwr[C{k:k`/;8kiy 3n^FS-*RHpHDFQ&0 ._ATs cpFpf唗Z)}6wxONy49bS,T9TpI>ʥ~ft?mλc{O'Bq1>yb2B"zP$ $G!l AEЌ#pҦ1hxHs Ĵ[ [ PήL$ eI6|4")!(s9[8C:·!YBeҒdQs& Hȑjgx02KYr͚ENqVl9{fzb[5W;>ߙ,Ҵm 3c/, LBs!@I<o|>r-(D5*|ZZ;JsK,2=SZ~:NZMr#RڡT5r= L%c$%AEBFo(8Ϊd1xQIYd|ߥրӎSԴ)d%T!,h&xI'"R 284ИGGv?U }J#mK !\ "X|4b Fh˱E,hCF# ⼃G9x,EL90:[tO`vrCz"㹗ZFԊD?$t&' 6.× 6B帩8^P:I v:V8[PV+^D]u$[:_/x?D 6Z(F ID6~ BJ[D ƣbT$6Υ\h&r)c$pPq6GZ8G(D t)v^j9{& /\n[^ʌb(؈R$i-AN8 
1$ǟa@L4CoDCAc;$ohj)Qf B@B7Z1% gocMPH7qgj=XOnxUiDHt4Ω`%XPc0+Б/" 3'C߉cOg\jPQ+CL c- Eţփ1*!: ҋ| obo^x\!Ֆ \-Lh崯\LqK0>샙3_].+)!B<[|85c2&\Q@vNF ]ݑ%9_$޺dL*L\ͽ4, e%ol$F:GcW){Q'UP{+Ԍ͆g7#X9s}selb #O|kiп<[nvQ6~Zgel@H۝iwm k @pT(YFOٿO<gl!sIN:s $>O}?Az#~Mho4J]c@M*&o6Y8H/>o?ӿ?'}P w`}}n[nt [_`nߚk֒|f.-$k{}7 O|n8?OӓD̮濂FΊ'=LQ뫒D=jQx"^(7ecMΖթjRH~;O>{1ԟQGO@ Vsai/Lh'+ߔ޶Q'%"$d1Xca w(¯ɔ42+Q%ȒQwlȜ?axJĵd ,qcw%mʫ 14N*,RMvh iJOpIsNeKqn|:AJHYJNۊʠ('$^Oz嗤njk[]ފ.DPp3]*ZJJߥRyX<"xXSFDh?7(MV7 R3Mýu􈮛|r-P[BR" 95b/ Lbt< 9DWZc*u.z[ZveS/+”85Ŀ 6=cpvc Ÿ*L?`<۝<>-DRPT"Pd dtSߛ[ix6|`Y.K1\iA+J8L9W M@T.q ] =5|Zl>y&w+hqimaXL,|{<t윮\i&K ]a&{!A u .Y`qQVJ $m%B*.QITWv,,sY1F2/ezZ/-|¤AT)Q>Bf13,J($G'*O֠խR+a]z`y$jb3zN wX;PF1[ƨTD ߄^9u@A MBO1_j5ɍs郥=9=Lm(bJS7S ZVA\k/T^ AM=oH]UPlJ]UruU@+z 'WکJ yW5෢*_Tj;TWj+M;8VćoϙmO$i??f\Z^ׂkx t%x\Be1Vhm jcM>#b]0rQ¢F.T}tnCIheeMV&DdURA(E5*uCa0pi鱴(Ag|pp0ЀUBDk·y]B kްARmo+,I|qt@"l.xҬ,'\Rڧ I)AԶorNܓ* ($d8"LMYt9YAׂZdWB@9Dd3 W3g5ְS\Aؓ:I긪 m;Tm"WζCV].io;mn_F;vls:q0%10m-`=x,!dq5lCw23'MN)or2DLta}9rp.: cG7ET: 脑; %ɨҤppqC+ryytTSqKCG^~I4 W]_&H ';׼Y {qf|D(qFPK+X@{9A%mwYRO8a%" Yn4C.R+loB2YhM L/4Mhօiq ;_ _]j֍ʑ1?2Gk`K3**=ɰH242QpeىT?TLL' *I4:H:(4$E7#/yWJNJOIǝr2@(:}jW Eӛ3.p !qqK]/*o9 i0;6hXks-[9:|1L19wEħA41IǻqT 'N+a0wB&tf4'6YN vbxKrZ6琳E:c  B IUXxտ0)ѥ$_,×qLlR zY[ CZ|̚u][AF3 J+@ .fgkЗwGٙM_]-,)T ^zf6ZvXMْRANR..%$1iGBN6'# BhX;ԥHZRT[ڱk78ln=9#6e$WSꩆ$BJV@'[1 >7%£u-tF@oHŠ66DtZydLU$L)hd.sSщGr M`2e ɘwscԬA _*Y &FX]J*XQb2cKZ.i܎uPQ].*OG{ jtvԓУyngVˀK`aY8T Sz?uE**Q}wI"Uʠm .$W u%D(J*ű'kJarٚX / pn//]/b=Iltr.\LG͟ %2݅OcZԬMMbf]'D`Q H%VΑ9^B mٶ5߳fi |Q^3WU#6x[(=<*.jxysM[W_Vim-|MJMe1:FVً:)bF`%R]ʥrFhQ=CFВDWѢR)m|3ВW~[Nِ[?e' "\4(r11A1T20#[:&s gkP[YQL,h,+lTPw @tʶ C,P b7޽/DkYŅLJa >VQD&$HPvm"`-.}v:Yr>w Xb6f};v3iQN~~)G%gr("z<>W^!]/B T@o#6* A~~iHg9}cZAE\"d/AM:XS ?Wu7 ?^ k]UxӀl{Mج]X[>%?,~>6?|fEtopk].4w- s#C>^XPY(6LiKkZjJ,m mL:]`ڞkkr[]my!lsi6JaϤ}4}tL ?XMqhnNV;\s']0\Ķ4/[!X}yi9I |Z2U8 ?ha7x6uVu%wҽz_{ZvzQӞq868i7Vg^fY'H+ ƐD[]qddIk'PZgQڔWE;6߆pY;y~CuխA}*JJZs:0ڢ~ KhW˜FA{)}1뒡!bY@agѳ"uh5oMj'c.TG X]J5qT^d>A$ӥVTr0Zy*#Laj'ǽ5ma:{9%z] cFJXo!"}/~+~^Y /g<*.8k 
8k_4hhDelgЀ320Ra)%@o댴'>Wq3fw]oy?FɍOұ@QJTA`pkgT̡ˌ ]"r^ҌWx w lysx\]8ﴕu:OBh]0R(QKX$] !;ؿ2ЗCxX/@?OZY俿!LjJE ΰꮪQ0M)8FҎ4ʂ$tkYyS0}jO@l YnhF`|w C+B e0oH,D&u w\9o&a?fyPK@d2pL*"03,*͸JB0  BwH~pǂq?N(VhgG ,n Z¢!xs@@ILb4;2M*JppPB3))4\k,I1e: RgS4I8IvȺvɎ |jmQ˭"'RWY=Ig^J0=AIA|ޏw#:̵zW}Wo^߿\|uy7/|j0MAe+683_.֟SSjL-P^g^䒗̻@|bmMlk!@~ގWKWl݌Aӟt4ӕug~ldP $Tp׾P! |e“rY1VHk!0%w*Ƙ1S q쌾X./=AF=iIM/jW?u:!G\0'4x҄9M-BRK'7"FEpiH=X?ax?p󕥀*0ɤՄ BQƔ88HBb ^h2Р J>B5rWʍ7QV-:eY-VoŔdhgm}Anڀ9kwUYW^rP/Z|Ԝdsk,Qp2}쟸Mę@o8'BD18ȹ]Ȱ, .;A[R.k%/_" maojYj/VpՎ/jhd'Gm 9v qptiM[Aaq)$"_3do!6ra${(^=J6egunsR *U>i["(Aw!^bjof'E+h5^G^kqZu!Z\ 9Zr;Afxd9 VGվ>O51a׎3ī3J-odhѮ\vu."ȗ̱v7OVƃvnTz}Jy[0Hsbw"|mԝk:lE{/x.\< ө ׁ8?x&41&H6!'V dI{^q%SO@4{MW[a~0tQ{&ک0@Y,5#ٟd?\d/1? 3_`CC:dX ff0n3vƧMyI?;6Y~t`ꉪ&/kdHZt!9SSDD@ 5^a4UTsc%b@9NxaMbMrPpffC{ a)єi"]J; /2/KI1,|IVFA'0'7f~--м%<]AG?ڙ@;vR!E@ Kd68P!rnIl=9/+>(\V پ7Kt!<[$ի D-ߏxx6林RMu+?*Y9y")n"ue`/.\"ͼݨ\AA!+Z[9h'toϔ$_Zr_%b* ےik CLXןg'ې۱B:Zxݓ Vk-*q"XAd<ÙMd%0,Tɔ*"uR%oi` ^u,J 5/M"}2V瓌Z/;OLx̩=XvrW3߇P_h-z6YOJ-Ql4x9`qoVZ|-Jl/?9;YӒY5DjCх@oLJZu|.sJ bW9Kh.>H4D \%Mz_bH4A1}(R)G`tʋu gu%L'+|0\ 8[^yؐV 4/LT )°-]} >*ajSbo_6JՀ~Qo ]3CIQț=i5aZu۰){06io,;l,&[|?ݾ]3̑{wpjV̫Fb'gv=Njy(!6 a!vCbs`$hg>T]%(gKBɨ+P,9uPUvh`O@:'$Ai[uK|8 \r0`vׇ۫8zWOQ])L$?+XJ\JPl_RWz'ȇg0&S'G $$ L>IK]m V\rʵ6!J/oAEMz~ u IO aY-!Uwߛ4٨$YcQB`}R{B&ܦ'JXQo5&6ڻI+PR2¤ch m-2I`ci9I1u[Bҡ{ӪnM6j1\}iE|Ih`Srk@YD"ҊK! ~r1wRn$$a. 
#,t`z: m5'{|,F4#0;wQF!oDQ!gv2AEZwEԃw7G$Gkd\|ܵG/73IT@%E 2e8&IP fi%!!?{Wq @mÀ?8k'À _b,hQ̡+T8$%5/,:8:Ld P@V < ^*5C^aS8֣;"1`qu4Ω`%X'$`S$# ^ 4$n)Y۱Q?LӡLN%,S>WHSJRYa\ bQ01*|uTV^,bT+q wY\,&H=aj \>6p>SE$&欿Ƀ0Tǿ~c0Uᅮzr^a3`c[q&s+ww )L & Ӌ{|q$c [U .1(ۋ806YqzXկm LJ:'XX^}\D:[CS1˛Ի"iV Vo|&GHƥ;[NhP4=(LHݢg &ԯgjF&s~u?M|__a||a!P<_[?LdmBXmøng;ï9mwN\mmhm؆ZdAC: V0bwt=cTޕw>d]۞12]q*#w2CsO$w7ibI94dpr_?÷?~ ?ǟΟaуah#aa5_KOo"e ~ւ5ozkS7 }uGns];}?1k ÷Omz'o:ͭB#; 0 vr _|3EMWX߂ G!^uݷrY^16Hhe:b ]J=۩'p8_o~zlK~VD9 8xl Ẍ́W5X?u>97ǔlNJ&( 02+.LI3/BU4I9uL O<_YIpp,11R&)3V{r\4*We`QmBل:|bTz_/"e˷*Sj6}nǣǾKjUFP ج793* &+j|E3wE"~㸸vk5,3`<jJ)+" X ENry4ohhhht>hUFx9 |pR&5g' P PV^4|iMmXS;c@L'-u#gw2PX(!և  K(lvnP!1|MGJoj;tt$t%)je&gGO /qd= IqM#T&32v.G]v'Fn ͚d DkՎ0 Dx㉎F&("8J#(AT$5YŰV18.a۳^^qn1j w&۞'X0\Br2 E(ԡ#^QoV Obp[k0Hǔ#gDFLQ=x*Cxׇw %ųHw"9)8fQ1EPp:J)< 9('e` ;gQt 4)E7c 3jFA<.8z.G1KBL#4s!޶C^ a~F;ؔ;̧@qrxt?(l;q2Jps Vt s٤G t*u0[&m6CX*GZWoW/q^~zZ}EFOqøl?w-5,roWmy۫ow-2>V4|sH6c6yQ[6>FSAMRPf[U1[\C[Er!XsqI $' Y [Gs39##`^vIJ| y:r=/ْ)CXBK5 )#@ ܎ܵ)UU*"(h(W\(ppx|^!,N4 q:G-f>\3_׫򗩀oUxwܕY}_G+o]^7eAmq0roZgCϥ6<:U`tY'7ܜ-x`;Y`M.c8_oY[?[-Z0b4M"&r!ߝdN0;<`x֝ <03^^6L-,*av)FԋLjiw y_@d!k[Ben=.*wZJtIPqu~IY%UO8ކ{Kɜ+'ж1[(j Z.zyv6̹)2U.HYl# atL`RcX32o$Z5!s&bbIb.*N' >i%*Djvt{e}0v{m~i  mHP@łW[YYr$ٛͯGKZIebC`[։\*ۯ*ۯ*ۯ^#SV3>S3[3j[ϢmPZR$ ON8mU>16u)D3$m[p0ˉy.odVJKTJKTJKXJmP /畛gR /R /R /RVyqRZ|c5Z4Kԇv<^|>%cgT EC_pxcƑs5XQ.u\VIX%U}pb E d2&2*D$͝_{sgsvؽ cL[E<^7\7CS{7\S:;{B=0 O2&PeҪɂ:$z,O,ePւ"a2fRi62nu4[mMr%AT)%  T؂YBIPWoѯ)&EyE,OY̼P`#Ջ9M/K9M ~^z&rF J $2eFHdӐMՊHV?"xf4IX u*|$,+kV]4#u_b<o (&$ y<8,*>IE-D^^ֆ\:τp) +8Μ61)Qt)xit^Js=?R3} eӖ}R m(؈fH[p\w䝩ޓ R>b!$t?h|nk̷T$&1R2,ϕŽB@I. 
[binary data: gzip-compressed log file `kubelet.log.gz` from a Zuul CI output archive (`var/home/core/zuul-output/logs/`) — contents are not recoverable as text]
P: r%Yhi QK!.Jo@z@jEoNċeK2ER[ +"@¸9a(D(% KMzD "乼=0 @ :s; 3b4xe0~#F#a7.ηsLޟ?Y>gkGl?r?nEx^~@_x29Ub,CmMpţJ&#Ez)<<гzzykc^;bbrjrbw"zXBGt/dQRy7TRAB5ҙZxdD zaP2h{Qo7@@qo.O%C&C,RT: ā?B|g^xn&"8si]F#/I)NRՄ'-)Ɵ{OwKnv82|2[m+d&3 ~{;\ǶpvQRl:6x9NSޏQ ` qWdl0sE[Ϩ=⼟~%󉟌'_>nliQhnC)9Nĉ\786/jWsˏģ|muEh:Vr\STWWP}MDyTME25wZFQY9Mi<՟f7|ҷ˜%6駿yU$ "$m BV92n\rׯbQp$T/(%#EF/J>qg_]LKqj6]-䥆-*r:CMfrzcf0j4?]͵E骵{U}Y>9|/|}Xs0&ɧ/K䨚/cڷ\-a [Wfj[;Tdo`cQ]0PlZ; $TRp^TooYM=[O\Yثqv6A1ʆh/?=ųN٭?Qbu7gfQAas[f?R۾n#<(JT$DI)SJ29\;E x˲N`#wel7GE^QGiw輁2Z0d # JENqAɃe W(zytV7q~5=vƮ yIYuA^xXaU{R.z?~iEiL`.dg􁳹vET>p^k%U6X(t;\e+eW ӂ@*,tg*dW ;\e+y7'S_?|@k+5uYVRrdpeW]o=+4s \es \{/l%!\g֝+4WS2upwW )`l&+4WUU#\N%[oB;sx_ q6/2CXkΉ]Զv#2-bz]Ns%V54$bRsxT&L`\ iE:x,\_ȹvq6.}4O4ϧ#N)W8;jlg_>ZqOCԙ9k 2 ͘Q<}PS뽖-)b15\W$aN"$(&|V!%)l%ILZ{-v9ZOu42QLkEphiFAT"3)Z-.341tE־ZK7Cuf)Supm]$端WHf^ߴoZ7M}i}ߴoZ!Jљ*4WJrZ+<*Rs'W*.Ҳv=ޯU l5(lj% ! P: r%@4ZbRZؗ 拙4z欯q!K($*^ ?`y%/.duNݜ9.Q{(͒sʄB kDI"NwCʥ&="\^Nz`x`|SAN 1iBHtr2y_˷w?*a8VyG, S.2e6<@+":3"TLfCF!zrw PC1hD<$<(8:ō^*y O֏?[O!x{BHʻ D# 2) 3'eDr^m;}F}W]2K%@<;rקuKoƹЃ.hGN#hލ$ R'^smj“Mχ]Ow'7 pj,OS?jrd&3 ~{;\6 jт:t8>mrH 'JA@T8<;w)b8`<Q5i{y??{Ƒl O/"w7G"uz(Y|jJ=0dQaOqtWWQmf8+gg?%=gHo9tu?&(g7seO4)kM쏲^ѵ$)**= w\ߌIɇ<8ap8 [K9+DRcKZM5[OP?`]>*[f /.gyWb&b7Wh04ÅDM9\[ f? QlN,@D饧7䤡!F*ىrqCgKbO4ή+`2)@/ ~zE伫ܝsbm>5E|(ǵO|U0ɋ>7֏ ?nD/.Yӎ4}Ew+1jܔvMItKHq oꧽ\u~}i||׌sp6,Tn2~o_~*tiM'&~& MX$?o)t,3H"-~o(7НGiY`Í",z^kthc.rr~ n,&t>rgVU蜴.F,!cکC=\hIlԝzY#EvGxvKj'=2􂁂d83Ș:D#9tI8R6DGle?+Ser}zbYD"V! 
p<ǯ;U(wY`NC"B|e6[O1bdI AqzEc~Lq6H:]M7M׷}gⒸ/Lܧ DE\x ױAJՇObTJC^L+6C?lb3a2:H&$/7+ .(wTUO*>$2[n"a7ýV@AHDp'͂RRCpY~?1~&};zz{D}Y;Q'!s2J.CpdDb"7"9]*p<[6s=ۡ.Y [?n-=w}1BU!~j|˿]e42KT4,:FXu(\\kĞ~US#ža=HiІϘ&;9)޴ +0N)r[0njeu\DU`CL} sPgr&s9x'SAp=VgOºe0{tۻw_ƪȇ -)DEd0ޘ"6칪[t, H0VJE!!*c[gIF`)Qp>R%㩬VӓCWX]Vwh|+Yr?6 sz;CۑR7PO?麚^yt3g9'#Z2 rZd,5Ep+ P̍\uW3jkrJ9f/;R\!j2) dso0i\j{jJlZq//\cYo=~Eo~wBw|`EHZId%#> 0ǣ,TcK+3m(d'!)۔D AC2‘uYCp:Ge'٥Q {$ُ(rzcM_xH7{uL+M,AN2~oޱRE7 ͓'TYdLFxeJ DqI2Q')Y9GuT )U1T08 Y(X;km$A\{[O5qvcg5}Jz TR\oڔV ~ouZCG !svD+1&~>{3ozݳan400WXdd =[ryz图̫RH~q 얙aXypV 0F;O(!#A*-ZԋSnG.vuXxG}YKYwgoL;^'%0y O~Jᄒڰ\II좕L3IP:[;k5qV-xvցg_- BzoH _@B^`B &nWNǮ"/7,uV]nőjytЭgfh93]\2(3O;?VOoqU?=`{=Q͒-6qXr嚷~[P64!7k *"=<(FzdWz У(ULoL .AB)f,8ƵY#si=iF c,~Dy,ǛkO.n:)#ilԘшH_?I  @lz y>\*Ol2Vs+e_A@(711iEJuL--=t~B#s1Njhiڷ.bj._ohz{c}_5O{- -sps HR859+v4Y# xSnJS{Y-\WէK0.lP>UT7NXz#!-0c,Ͳ؂;r9Ix%.6m@kB<( ̖a{O.TaMipqqؑh}y-FEܵbT5Է)tbl1cwUqwJM^V*`'$k@s8hmjTlf`q$<qTQl̤;L6̺"\S`!YqItBrK#,[ D%RD2L1i_-J. 1'sHk7u& ɇLnʛ4(gXpz)匭RŪT_p\bK X9_1ݒ*Aq5>E mF0 RS23's֖prYnss,Xpog/e bўTѩlκ@E`L #zoLՍeOX 'EFnF+fNX41cYY`6[Dc(wUgO;kZEݦt!li3se t¹dg)*od p o s`JjEz?ei톢;!*zVdQB @UNj:9G`#ӱZՠV {-|t>{ut:cLCQ:klaB&;r&9)2*;zQqLq6|GL@^q7!<@ɓ_D"uVdfcO8WX(XF{UY^ާ\(e,I0t@"6HI(!?C=$xN=羢T3bo_lx,Fs w[ڏdʰlY􌨝 YwH+&<6G.F)Ah&~6gj”, ѐc![*:kE'C@-UC*)[aZVg2G'%x.0qi8&eQ]r[DZ_TLOMU~e%vN*J*/QJ"A|Y IQܼn"(5=ݫWϾ4Pf*ʞ<giVjB՘IgRNgA,5}J4bU*$V. }}w'L8uߓ{^hjq?r&Ӽj!KP͵_ێAY_Uj9-`z,Ԃa~R˶3{Ϸ\6,*//<:xJѕb]S'9%H 흈k*ǽ$xW~l!g ICDmepGd-٢Ҽ;/!< Y}S}8: K"Y[ ~j}K&z(6He^,t0?Iʉ󩝖;C8D )Rj9lߍ>͈>iח5`< ;YNWU¶6j_˻_ٹqe= 5#~asuÄ.;[b.%G͊-Ϧ7_99,n"9*GLkԮ*ELe#m` ij_}6Q!xuw[~)˛'l(MN/o߼7﷯?~$ozX`QR{\K|(? ? 
׿}hM} 6C+wڈ>g6]u}Nck]deA;Iu@Jz,8cqstytȚ'Mt5Q `#̏V7"O棴%\EeXt!&ߴqw?+_lMNkhF31X]'X9 *PNd=s49>>]YhTC&UryUz,~hju*R[aB]QwG6 mzekP")83J2|p)v\H9ZK6޶2P%NBrjW'r*jV]t i"b+6 d0]<V^4ꘃ"E$ ľzƇ )-U%2"Vʕ^H5(9f%DrQ*%Ak,\H2 M"(s.Ur+TQopnGWYh9'>3Oespp>MPb$>O$}NI"Cdu txm98VE1Pw%v߶ݢ~v+m.Ge݂HVjm8icM>4d$]QEWź4<1>zC<@R5ϧuDp8\sהzzuMLBɵ.ԱeOV,&$)$To'/u.oxd(+^X]"Ң*ɀbsVg}qZn׮ 6vݞojq}Mj]W*', [3 LuBl{R3l,i=$``n(t:tV]=C֐+VCW C+WW ;VJ-BktuO__oB=)]y'Jg;]C^:e]0jUCW NW@䎮#]^ I]sB\6CѶPJawt ]e/6WQ6e,SVPӣKWM/6p &7gs23z|Y⾾߻]g9xqz2O>/|AY?dOi Of"˧ Bstd{^<1)8ϣ ^ބ5}lzhϟ;Fel81DZ*5N2ܣ-cr*R*9x3!1 C͉VVKlW<_d)qBcV"ՙHWFiC~@t` pC+[wĎ#]v}Htf8QW@kz-{􎮾] No,[6/mǬyd:m3ݥm]cW!I?v֜a:=Naقb종tBw*'i=Q. Fn '!M;F;y[};Օ~tko˿Okm=Zf~+nVj_nj5a\U.vnE6e͆4{?,qBxm򞶹4fe17@W3IJel/ׯ4]u)viALQNTi 6减5$gz}WО}К6Xk㼏x2.C5$dh>hE:e8eZԞˬrGrb,]Q"h%L(CRJ,H!L)+jK\&k']&Ij)7\CZʤٖ*2Bx9y鳵p˰сBQ(500ue ǦOP #XǠ}2Eu>6 w [T%iE BVk $ڢ*D5$WLG@}Mһ>ס(-5Y+LL`5 jYdxOEt1VBd/2wE ^x1:pR7C`v%SKܪJ7Q8dgl7XmQ|q`QBG1 H1DNˌK Q*v8Lt,[Q<{  E NԭJH "3hG[M`Q^,TG'QP'GWwZ& T9%F]VI|'ӻ0Mvn,\wy2u1IWCj)"+tmp4=koGO;U` 0 ɗ ZӤ R=}IQ-ڶ]Uo*wpҧ{6H5A%:]% _>Q&A#TϺێn")۽%XT䨽0%KcE@?9.az-x _3` &t%{NwTZ{69> Р |dU\EoPuF E@iv%&XdT !ݑ8dDYUõ-"PaTB]$1:&Qϲ] A2L'J s n+it'$5$YIdeHm VӥWoU4^";ߨn%BG id_j:hILЋ![[Ե!h\G9h=ۜt =Tx&5l6mu$K;KUMKo=qu*ix$tiѣJ$vC(٢EmcMEѻT$$Iãiׄ fLB9$<ҰɺfĞT2r74J$[tCۚb͑t#nIף6Jr%5 z*2HeFBjPFK[Ƞ rw`-q{=J-+Pa>Ǯ;bEЉ8ij)ڥA,\s;9,Qy/*` 2PҢ#"Hmr3y%#yG;KQǠxO +YUE :#ǀ6oӺfav;5zic&EN3i^&`&ci@UBvZ'D??T赫 #>śB LWSP 6$OxwD6ZS0d¦g@=heR^H,ߟfxJ7 E;s|dɨ8Q{*<5]zB&cIPrmx%iV̯F} Bpi;XoCn:jECr7)D40PrAjI2KH24D҅K)ՒZzih/Hm,?:CoW4j;SP]pB[^4i]ͳb5p97mhNPmІ8}=ӷ?{GBf_8@~}SKGp=Cw=_?Ӵ"ݞ[Z\^y\^5?ɇ׫/,..vT 穬뷺&R\lat\k^o5~2(GNWm5hVpiXӣ\&8k˯g7GjFpVJoզɯiqzl^/ӓ+?4"79s+Ն)td}w8ZOA)0Gרּ-gȶg#zmq.Mp{.SEoiJIOZAS%OA \!p09zA#Bq\ш[2pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \=BOh8 [kp; :a(a%W:"a+b+b+b+b+b+b+b+b+b+b+b+\? p0&g \ ^-[b+b+b+b+b+b+b+b+b+b+b+b/ \y!>$ aW, pf+DiW/+W \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1p5*p9ށzSKw6ð>?. o8*T!ćgC. 
+l#amP·!J/4a/0x$W8~v,p=hUV?#WE^ ῆYKp,W g~&zXfJ=@c^Ji8 BX.`oP䊢f劢tʕZs@rM#WJ{(rhv.W,W/P4Q$WDks0ri+}+28(Wx] b?iͳ:x>STrZ_A.8Jף_zz~拳2?ǹUyB=vAli×|6iW7v{ѰC٘^/i7&ⳗF4r hs+o6Ԅs:\<p}t|J-f i!uwx4sv~'\5N@??sL}bZ膘jh kQ'Kv}VnJf4}0E!hQٷR[/i:M!,9Tuޚm]zJ!;d.}zMqt, 0C{Oq 1l%U()2z;{tU*D{I{-J)*w'c)PӛTL(8S(`8n20@ɽ(偁80`Z\тsR\Q^\r-\-azr崔_axrEÙԂpQ_E(Z](J~K"koL8 "v]\Q`EkžEyK`h%`Q`5#W9xed%U4޺C"{+ W\QnE=F=^ehd Olz6ySeǴoOޥ|xr>󭥊>^OyFd?w2$B3ZDUmQ|6糳b~_Oi#g&byRôu6;?E}~hO>c]*êm^%*jsZ$U#LcpZӢL"6[mN|ФU95Iuƚ;Wƨjn8>Z Qzo0Eb# X}aѺgeسYswTu/ T_ӛKyW[M?ׅ'iobu5 /m[$DW<*/5Y@޻(7Jl{nMϿirVWk{8z9Y_7: ][pqk{>nzO'ˣnN LkenK RʋGs2ob PrVZ*(>$4s2|zOuܷ5h'cQM[jFaYUJcR5:E {,S>娍x-Glģsw#޸EٓjԆ:#6¹_p[4M1-@q^=i全ֶp_Ow5'~tz{>i;}s3<s!nOow^Lur|_<gjH??~0ucuMTiZ 9"8z=MOuuI* s( v-Q лIU'NRZ:Djle-q<2V3J~)-DD$gcdvup90;ԳO;0/|_}aHjS߱b'ZSvTMCU6T0L{Di(:n(KoJ+M|qv*"@v"ĮLl&N{FV¹[lZR}QjcU[ښU/Q!Bh^Ra7+B*Ý[A27v4`ᐲ3<6Qj*)n+Z^ͧ4zLޔ0"V?fdE4xZ{vm) 7 ˩BҶ⅂ ZWevy្U5p 7eK]Qt&\E\s63Z8w{g(BУOUwyh Xk;޼ |׉j7d1R'dz؄`BGpr`o5ˬ\ے͗~=[>.uu8=OViՉt{cyQ<72)^tRarEIv7 WrՉPu-jYT!tծ!F7m붵gƤ.=?{϶ȍ4&6L6 ${yH/E[;8A[ݲ|iY#mml;UP)L2#k(>\9|,4y?~t 3zn+/=7K[CY,^56=Lqn•JB5lQ -bt@m#JT*$R]ٔ[-?JA#. Qx9si *Փ*cnk׵˄B^$TH^Fu'(#ITwʚN\4=U,_6Ȋ9_&Nfsa{|h#4EtJJPB$,'Hg{'Nq98 ^1=m@pؼMɭrzKY3ww9W+-`<}#?>Z=yrӐIΘQ T UTEC bT Gp*Zp_:_@SD_Uբf?P2tPmٍbǼdORL<pPV/ a`hdƱZ5cۇ 04GBw+~~s: I |I2N2ͯ0ۿ9';#Wz|W?":XK*-w4 dIib4>*>gr,Gt~쭣駟QȪN: (R)#U8GQ.o&<(G7rU:H5\B貘_dQqSQל6/ྀ-QI  8-SR# u w`N'u= OQb5"oQTP÷\4l C]5s%8T5Êm,C%`j8*h_سpjk0%wk(֣k-vwzQ>-H=d_(\<ˈMǣpsjco$bz5d q㒚aP5+;2pi AAW͛{ڮFɆ &rRAIU43l[y&u\H/ j3i?Ki%h4 =p88l)#g,ϑG(aa*TN]Qc st6j{fчn7&vc^wfWa6#fQ-,gs{.6LԚ̈́;^֌ Dގ7gvfsy+p8R\w@1" h|LJ^CYxM- &1|\,ȼ(z ~9,&`Rz™P WbqH ' &1`vkx7kyi"!my߫U"z/T@l饷eb28b^U|SJ!i%!8e< ,mSBm.h%TIņ 㵭w,6xΣ}7_!f;Z{ 5onjaΫg3Iyg wȎump%>+QkTEL:Aq{,`ss_ gUi]Le*z?/*"z=Phv83̸톺Tx<fugׅi=slٻvSV,gvU< {J8)|(E@;WO>_l9n=v7Pw: T1˝'62I[P!aJ F8Do%w h-˥ʓW8xKeL3R)ͩ|`8wXd7'+nuC~/wx2PL΃6ڣ4uaIL!8P\ ͝VF! 
9& taGM֢xH̨r{ܽ,zm?SL!*!`eҒ'kGdb Y"$3e EDjD>`B`IhN=YolIg5]ȬcQsG{f%2K8pڨ#1HS#"{MBrz$"HORa['IQ!2yRJGǂ"@TTTv`'wSppƣԨ5rk&G9$%QXIe 88za{z7),gxRրӞSԴTIYDqQt$QFD4# Ly <*Fs0xDz@NY e] "hC|h4̧(W#.+ĸA[% H6:a,LZ,<#2 Cv3u z^eܸ; uU$'ZvFԊD?i`' 6j{I `~xcO(;!e6g4@I(|r5acWX_ë6Mz|8E$!icQRqIKA3 & ιدv|0q\h\8A ts&Q4 t)ZEy_lysxjc[Q~~H:! EHV$C#hI/T]|٢$6YD]5WQ%p("n״Um M- _)mA#c‰d(R8$P\f }*={X&b`xnUAŽ0`)XIP -BxL42@I2+Rrn@'A\nM%#:6_SXҊ;bk(;HMBG80!KH;> O>:WF1T)c]HVJ;im#EAr:!ɤɎc(~:yq%Ֆb|4KW ZIXӏuz8H0$~ӪGS.MKR_VI~xw5c2;NP@vNF  R<3?8`l˦mP9B; Vyk5dzn'LuaLޭV绩Xi1=\3W3fpe_jNttt8URŅʥCb\z7_P!Wo(+NQZAR&;5efďe˷uW-XF.>t^횣Qm,<9y}& k&I:gκa /,o!WALkl͒|:_ ]]9:-6gsVF6ھd׬]9ח4]KHX"|3el+_Cz@V5>V*AOFesFSDǿ7M_y{L9~o p.VsW~>kpc~%kjԊ͂o3;潅|̥Į QX֚8υjU!̺hAӟ|t)5*e+I,b/ӱ"4C\+@܅pkbPMBr˝7?-x}憒R"K.2Yk,rO:h-J:))x$hI)9Jqhy^iicB'"|D3̧+ƭ3[c)ATVL ZN/;{N]w}ҵΑKUC 0q&khȾtc\s1"^*1;f^r''ZƝ=.>p|L`L2iku$L(HB r&oVX`Ҿ`t1lӠim+V#KN2E=dږmRL^-4$E~sϹGLD '$)F#)[㾻4! * ۂ @`d3ӈZcL(ōy8MgphIz0 #ɂYɂsgw'8>->B#t-[/M$q< yJh(H.#,H&0JwԕJCt'"*a D\"x2Bj*TZe:4.ʄSMDd$\",RX0-LoR%x- 10M`3<&!`ر [ɩ8,<\_d 'U>GWkw'6Ǫ͞ʡU$tEL)Sz5e=>3"Đ ڄ(u\9@䉑\ũr1M",5ۂcKeYZJ gR+cg͢FTW*c 5܂x ig=vA?m_C<)?gْgXx7e>uSbހ'2m,PȄB62CL̵ <#yGl]FlJ-] _ t4Q?}ko}vM}]n.eNrn~4h6Qk"<8)RDb4,20S EKJu&i&չ5T!J2h-FhN.:0}81 c~H<HPȋp8yq-ޭ2Ȋ\| ~3-0 }|:(FVA +cP/|+ ۫n\?s7Tkqe/DR~XfMLk'S!:cB! Ը0RB #S)`|W4'[1Q&qA4H)˕\̀g̥Mpe08"v-qrW)Mt&v>7P[Cy\gXXkW۶ǮjzjkWTYWĀ}Zɟ`Sbs r P],M Q^jup( yy92JPl~Nr:?o+_ʹWbugNqySl9 d*}<(ER3P9 S*PĆl1ɢXH&|K_Y, ?$Շ\W(Ľ{Iן: T`d Z?=VxE5tWzqJ#5 |?VlȎg6ђQ9PpVK0'S0p ƭ\-\Ӎ"r>g« /b9{㇕]ie>ϫ6#!< *.BFuY {@t~N,KVC{X5J;XJyO jLÇs`.?*'gCdml"d^֊BS#)'Nax;H_6PNF &\ \<-\Ognn,vClݠf''mMՎ_wsݲ&)0϶|6qͥ㣜s=ġ_.Ƨ )6jjGA(e T0Ùk|O0PJ cu\J:LTWmGR2@ ʶ䯥R oWJn}Zb Z5Qu\J6kK\j007_pjűó+P`]Wҫ +urWV Uv-䀫p%+<%H}g`䊶g-YWTJ-\-zpU  U+PkHq*z+ƥT#\`PP u\WPPeV Pm[T3G\ c kcȵ+!pwq*z+)5#\P.7 *u\WVA\\3j5u*9pC\iW X2 PƺBB!;0P  @-Bl#I?W XPpr7DQT]ԃu|p*:%J(P{-zrYcWʖRul]poSpL-F2opr7 UJ2ઇbAB{+/BwWP6ઇu] P+1_ŽǦ<" x6~k7e$PΨV3$uJْ!L*Nj&HE+G.663"j-]wQ⃋C[OB3rWVΏJA\WpfG\ opr WVu\J6,#UaҧV? 
\Z!R#%T1pefkU+Pk X.O  W({ նJ\W*NGB_Kތ]ڶvSi+])%Z&-㪞\Ӳ3XK-YWTv1]WzվEψDx+7BF+P+):Pe)Sj;`FW(W3_pj)a]dbUq%T{+l7_pj:Pz+I>YW(X+k5B!~ `jGR+TkTq*#l+z+l!G?S9\Zλ+T)\WC+l?3vzj:P2W3@0]\S Պ;RWWbћS¥TcW}"C-z*L \W=h_y+lm]Sۖ3XO4z+ @z?GڽwWkp\= '<^y+ˤ/BB#~ѽRP:Qj%FrMZ 3RTb2D!E*NVAIj,CC1F[Bz(dU%4"T޸_\l!3.6.v]lh4"W(X_pjMGQpG\)E W@W*7荵 .+P)倫>J+O ڟVkWw޺BrXG\pW XsA+BcWҊW=ĕR XM o+TLq*ͰWV,z{J}g`:-SZkl \W=h6O)\\N|UBF!\0ph W(Wx+TUq*p\}{蹁/T' Ky<+]܈ZY^M h jA;^V-19gۏ2|߿zti_Y/Gܟ_ś ̿CamXSֶ|;M M]__x&=_x1r8Y%Omi/^}GΚڹnl>jR;+Wlّ|;{ͥzA]ۚSy,-We9ܗs[a^MYx%NI/1S9Ǎ4&"J%6%ьG /M1 ObgF%VFT')sż~¥Zy4)\F7d5,/vx'kzi >]nx~~&Ai5drNAhediR.~J,7qJp+y/(5-^7T$p_^͓kW2?[x[^l_rvg`}-LEǚHW8uqI_n6 )(0}<+OE,MībtP]-M *7W~iQ*'ۭh!jh7.̥`y.8v1b:zN7D{%lӭP|u/oټ%/=Z3xѮ/(YEmW|} tu<,=adȕe fN g abF"ˇ(h%6mK[uNU?V/]OWn l^&OO~e-[ʀQm \]ͷ:?KF7(?B)`%i8ɨkUUGU6M̒4A5uUȌIf0eR! i49 %VPUz'̓7V=Ci m?x7Uh?BWz``uFPo+GW ÑVEn~l:Lw;+f%[*^ܻ{؝0F ?*s_enGnJ\kznG4 _h|6Vt%;|=Y1sС >NLןsY!{T̤FWlvh]RF[/0-XYKAd!R AX@FaSR ZK}9 *b2:*e䜋E`KYc՘6qRz &mmx7aͼMog8fHLg9>Ӷ[tMl Q!t0| ` k&SOiCI[J[؃H[0KW:KԶKZ$,a.lQZBoQYduJR%@)UtI0ujcb|ƣO7힩ir ("Ad>dRR:@g6ihMQ昌B`\ $\HRڠ17$Wl$h&㰴X&{ePX $ :`B9J-WX)y5RObzKzSLDSъY9%)`-DC%Ģ9Zn3L)T-Dk<@T6fj((AIb=aTa0+t[EA9Ku!,XPrZ`bTE;3&d5gG;_:MQK.& #;#̓e`6 @%HQ{'( CHo`sZ_WUwFTc]c B 06yM"b pQxZ͠V^yґ'-s `v ֺ>2jr%H{<vRM>MI^_=qؘ[ĝMź-׏YI\^0n6q[yO#]t%`e,rjEtG V4 IFeΐKM LޘcDaDegk1er(LE5>vA2K* l>1S% tȔ<:Zir<ͣy,r{Cfu֡ :/6ŏt#met H; ǹ:@k+*lV_M@eNtvXt. QFqJ3q=vaܜ|3&&8r*Ʉr~j!ySd0cGE>E&DNi, r!Gy.0%Je%Yy+qv\C 5o6?x)_~, %٬(ND/FaDc&vBLcH•:_nto@],]<0LYSAu$gEdφ jBϵ^Xq?aͬO..X_xuV.S6y/VRH9i!~2&Ra)Ur3GRi'?"rB餧 P]1U 6l+ ;lMh2jhHm iا8lTJ1I8k(FtK5D>1-+/(,?[F\s[GEZUx5"1OOFr#y ,K}oj frٟ\wSaKx~ ߟ#55T:&58%:٢@B\ٝ_Oޮ5|? 
b|ַ@}(bUt=W˾=e^PM5i&kTX liЪcCLy8(ڋՓ79L4Xk]4rS ֦'u1p^HXi4 _F<V BQn+԰Nאּpv$Y_}~?|ûRow?~^wiDX}EOWUEphyRk 1w{B;K?}?LL/,Ҡ}&n&A]uf~9[d:0ԑSs^S7 :S7 NHICFGY3O& 5Tc1޺ͷ,Rt.:(Lfdd P`Q3=J0^,6g]"g!O!e C+GR=z<;waS^5x\Kly̶\1GJx($eBUS= 22Je!,E(¤8HiB@3qv?oy$pM 9Z7 Wmid_<3~,2[=GpsTd7F܅H!$3;&fv].!1=rI$Ve?ˋ[彀QL{KeQW \ܞr:ri/9(권$QK-9+JLAfxGB3mu^zv ]!GWT7sCl-9b*rIe(=nt~ b xCzZ߆Nqp/_5|^ GwIPuni|g%f1N~y`E /d @p\.حJ;JJ*ˢTKOTQR!J --% 澣RoJ&+tq.ZpI"*=qÑ,;VQDɌ댊e2u3"ƠHJd 1%0l\Cll>L쬊xV5}.Lz0{pzeU\Wb<MGn4H06ucwI_óxIPN c?Gw.ާMmXlR76Go!u(2LBﷻpuMggşL 鴕%QT*xR ‡?Lr_KGJfӟV<{?T 'q+gBqQ7yau3I_1JX藑/p("}\KQaL'\T nj7PqO1B323Ub2<Ӄ|3߳$n4C[ًzam÷+MZZvFXE"Ek - $ nW |pVhQD+Y!EH)AՃ39V0$L`=^[4lDz0-@19х4QۢMpo| z zn{x7>ܘqW(4u,uߞ=|lxRm]i41O@K@H_XrKr88g׾"EYSPXYS<:hFV=-c u.9\Kg`<߶hldVUt~:\yJZS[o srZ[aOV8fS{9ˉʛ8b=+CQP3U AEI+Jf)oEXRg"P2zҾ*nǀPGm7 ,3r#c6q6#c>$ ΋m} dn?Ixl> \ZM A~m HITXh,?"8` X*jK`;>o^lE#nHʞZK(IItY$) Ӂi]*w^\lFl;XP;ifԦjw vs`P@ t8!h )cQ)ӪRy L+m$Pa&:80$ˁb9$` r#`|Vp+Oef<&fYgY+.f&!FGELXp^b\@)f( K*A#0 Ñh'`yclcW<pG@؆Mzx&߰!dcFQ@naY?f~dh;GNL:X<**3vi> П?Ũnb4 l!gCƩiOud(EǸ./دaPeb3yx@wUYn&. 
&cxjn ])3ohX>dH#\ikј6FB)oIDNDZUu2U5E!l&0z)b r ~n>-ϚͦkR/YТ7{Z4+3 {{,Iz)jh/8ww?>ä8,OigNxHU}Kwe,!]Znu OX^(FyQSIڱP fUxጔ{+ gEo>KH)aRKJ10YR ayi,IQI)%3ZX&/::crY6@zL!'m ɾJ"!("LjߢY&"<$]ޭr6;"6qt=Xm3N%"׆~f$4y'm}ɯJ/O7SOdmJsP0{tSlIJ&O@8*Oڽ %jï?BR̷'E>yTIDvvC.;(-9āFi%ޔ 6R82s r$^sJ`NH!wV JS#GTFZG$CyB}ކuvPu o_uk{!׊֤Kҕp9pNA b6X=.o~eo> 8o]׽j/#Ŭ慒d4ZS4SVK˳|CI}Xhu45g͟]_rNcn M.Ϋך[wݻ溗g^ncĹȯ?po}NmBmϠ-dﮜ?*6W?,nN^ۓiMfطWw#h%fP0vY4a#A2 <}KʮTv9s+OFa>mswaXL\`G𡶼;-oC'(}fl> S$SOe{d\ն)c[b۸F}s6_ tꉍV7ێi"J1]lef]tb]'֛'\y %qyå U 4&՞k:FY^1Tk%iõϸ/f{zOzq43=/ kmmwTRTo$5+TF*֮ft3G䈍C+kn#9/}h*ì&ߍ~A]?grp=4xdǦUrC5?ffqxs?}Ƿ[ }gIu7lSVfs.:f>km Z5-FvhZO.ږC{ҷN~~4iW7N^a]\ -~j~5EqiWZԄXMS9[^1Hė,!@Iki'sցƙh Qg$fS0\@+ MS\c; Fs؁֫8 5P:oc(dĠ#y PJN&ro4ͩ06qYwDKF:7Ή.w7nz4zi 0=&/fRƝyw.Gt$S2Nytu"(ןM+l={::> D[Lɰ',*J"PYbؔB10*ж;e 5|d/NF؎;$5wH+ Ykc e\nh uip }5uݘ%f-cW16^BO^8)Y{LZCx^`r}>g}L3-"1tTE@DQ,ySz>S34'ghNϰP[0"#+v&E:_Q2Yݣ4"z |S &ڨ(ܦ!e_0 C+b6 Ʊֳqb,||nntJ쿩žvaމχ=l7/ɺzHmS&{V͎ FuppQ胋&1].I ٣&|bJhGXR^"$`w:;)s^(z] Yb#X4RlB*ܡܬ;OkZ q4,?^~lz3zqEY>Sdc^I#]J"Ƀ|d*4xq_`70/Gsit85ޘ7eiC8^ʞխmB!NqL@s Z$`B!X"Q'aM-x%y|tP-A6@ۇ0^|*aG7Hx-vhmTP=i!y/8z޳/G 2# @v@^h(EeM$A1h{5F)cҾ֜-Aی]681L& kRzYwUՆR6a2,uu" UPu_EX|NdeYq ˦scT9:-qJf=O@իjgN *>hϢ$RڜuJB;*#~ YAm=ų/FNH,%ŢzuP*aXCF6ֳfQN/O}]A˧*w戍e`< K{"Z[?`WB"6Av Tߊ v_\ӪT*GEA>DT1U!8&y:9`бZ͠V\qR'=J=$zt:cLè56m{Ƥ`"I*t GqRIdI V.n:`%`,1I%"ZG 4$#@I*Be-`pۯ4\Uv4}kEлqu_ eķt[-1H( ag:ll@Bl˹SVt[)%co໠)KG FRGGG>YF8b,d*Sڡ *VڄRXS((J + t2u (>oR*OfC&}jWit=:z&tİRst45݌MDߑ5IwV uB,Bf|OHs24iI$- 锕%UK$0S}* R,["/O)DtŃKh %IEiE@J+Ici ug8˧ ~pY'o`&Lq6ӱ-@dHxgL fp f{%+4LI0۫/"@J 1eNj:FWQB$%Kk J6ЖFMi)Edk*󉱮Cr2&)JZAKAX i5 [ӨӸsQr?h^*6@IP$x#Eqɂ!XD@'N; )ixRpTTYS=bCL? 
ItރWu/ق>U.z>~fI蘅'ýX˜s $BN‚50BI5(v^W}yF(E(oc1L).K]Hc~8¶3^CCgJ/% O0_`\wbé&7-H\2$c] B:gHuu@yAh4m&KynoUe %8X&p)5A֬D.ov ͟/`9Se5ӓ|ȣ>#)gßI9Ǿ|ӫޏ?6־gC926yǐ>^l)P+l6YqNR=B ՃrᅥP*R6BNe]@ J\:vF.-#fYt,Dy 4Ix0GTC]m4DLI\D Dv'9.1d!h``$"BJzѪYw^bx{}bkUZzP9oza4K|_~ձw:b_T^RXfQHHI+Pk HP&Dy[InӦz܆nq_ !5cQRbȨK2]&ʁobJ􆧍Ṿ\l4=q$2Cku7K`/q}BH$ARN !%Jֈ2Za8a驮>"XS/hA*gnR 9WA3qS;@kg]D!]WC N.e<"L,l\?ڨAGA>$Ŭ8nH.VT<cVY/xݢޒdwFuɉ';ʧiP7_ {_L6ú. q:].B~<]I&0_"-泔|eg>2EắzgדqWtZ,.⚡=<ϵ(%'xNImֿvə}(aLMVZl==t__,k|WZ%3ro#~'?& zδ%ַE!ת#.իj-ޣ~۽\Ue㧛X>bqyi&n&AN60P-nxs"e~/a3_lt9|4:`ͥW;~Hۣ{ lM܋Y @.˝OQ7gwZ@7YS~pF?~n= Ѵt[Cr_yX7lیb[ՍQUɡmDg*Y^߱3kwHisdᜓ m~120+Ъ7lFaF\>j5xt&n׷$[ﶭm8]{9~b&v_J3J`i;#-f3%Δ.SV|ywZr|\vf J33D!ATJ )bT6DE,3M&WOB~~:S5x'o9WW`!:eYցI2[G`fq Fꁱ>Y7CƵev%_:_ >dr!K'Up B.Q~qeGZٿjefdqg(`6Rd-rݢOBDO,+*MSNO:=nxQpOUm(7Q9Xd5g A%z\t\k>_Nw=Y}DX(2~Y;K +2s2J.Cpaȵ&B$Gu=ϖ+|)(`[kS(`M7 V ԻuQ~$JZqj]/pljO`g aV1 ΜuN4ҥb|N* Va`=ȖՑ"Âg̓)H`y# 8!&\tƈhcƲ=$L>gEXf.w +L.Kw: 7n3s)X'D{9>;HX"AZ8E\75"rxoLdsݶkrV" P\"5R"U@!\'a66fzYo Z I3xT6COXݡ w7#H`亯zɽškH2u-Qyt]fJR$ NFVH2$fi"Y2M BҨ{B)25%վNEr4SqA$-Eyqs|HFfGv\]I+8 o1W\/*35][j|Nj#76ga̖$O4 rK 1Uٔ`V9-gQ]., PŞO!)*lJ2T p0ڎYWrY9*1mt+s#vҎ|*;Em Y !f-Q &Kz*rWMY \S<,213 E &eIda#8Fm" E{Q}RZ b38UcDTGD<"]OPH]X-_[ >ے:h LNN3CFx#VBd(gB \2YbHo"Ds=j4KƫI>\,\0Vy&9+tĂ P>yŋ%0->#.>.>aŲ/^m❕i{7X^u-\݆j?nz1~ٽ9fj[+CvҾp/S$S轪TL]q},KO4*ء+"-cW z/w%!WRvׯvy)Mӫ>֭otqOR8it-iy*#wj>xn6Oxm V ߦo;u-YsG (!.$WV=0׳2OqyA~܎?yh>Lf65.F= .{lK)-jS&_:Hy#BsKCM66t[wcJa2jp7j/W]Oݚ3yLSU `l Xh,]#˥-%0!.'5GPI!7AwCB `!S ha[+A$ gE`(NERJG""tk@AGEGia;̺ r `P* cjcieU"@ٸ"0ȶ 3IW8Gb%'j Cz!URXx L.1#dn "eu81sJ@n,2r#S3Rc Y|r e<(dҷZ2)I1Op2wϬq~=(Ws΋8 o:k /5s%fA56\ fx X8 dС/jЃ bӈ0#(s#njTgC1IXU]|V[՘#p٤1%DCdAS@$Diui(?@ujoW$4Neg,^|(k)K"\Z4;Rt}θjQ|ōxԏS2fy4g\2bΠEឯfF O߷~%FFoavCzZ&"M!! D氽wԟ?t7~<ؕ=x9k%Vī%^-jWKZxī%^-jWKZxī%^-jWKZxī%^-jWKZxī%^-jWKZxī%^-jWKZ~Zg[,7ZU=k}e~'g*yގ1ǣ4;]}JzmwgA=M8ER"ϛvg(e7T7}5|ٮ1 V. 
"2\U{figQlAKuxs*[?* ڽYzuBݦ]kY\Jp_?oҕj& h3rQ@%miz\f|y~~pViry{QO_7DܪK>9:mLr\zw.bm~O~o/+(ߒ_ 2 nz񦯡_~6[xrw<|hv=7s)_g}0?[ӋYI2m7MmFVOpszV:܅E_3^}6P}yI1i|k^XwUf[̙?w,jlkw4{j_YWt0\pn^zH)T:|9Ld'x伬@!{QTMJm~gu-vک|u]mHh#Yě0a(FY]IR=Q0*k:D1\g_;,|--5V'\{9PU&tޛ >x1jƣ.aKɘg_zdZGzה[$]6z-C/CO.q{}JrhIzWM"mmwnv1o-'u&C;?[ܽxLxvǠ{vh磸6qg;>1_u~}!ouu?Użd\V+s('nVn8.\=oU{מaۯً6OK{O . ZuŞg˘lP#A9N2;46r#3s=YCU袧Ϻpy)Nx=[TNXI,*Ox1ݗ@'Dz'6jsUj0+hUU)h^ScȬ*C(JnPz }t#}nͼ8 ޓ0gOM5ٳh H>Fÿ Q` 6еϗn>qRA9;YT .ڪqMSJzmR^[¾m(cڴnrJXS-KUq#LW !Q92`88ƐZ_RhbNO/K!n>vn;l#r.g{0?e7uPWwZpb< O`.US!\ 6C?evZ*f$7!M5'Z]Ĺۂoͳ>;no.͵Wk_iovLEߍ+Y>To 535IjOtFE)]|•KҚ,!';೅wՔO{W?/w@ēAq螓K;ƥ.KEZwZ)_!8{TpoBe/ 0h"^qQ-S2Ld20lpy=ݓru~Pj:} -Ag00E-|,XgatBN:Kcrh<]* ~)ðœde7hͤN퀱zOrIQq# CB(B/tlW±Ĺ母,H*4rAwh\7.`ܝW5b6A` ėQ;PBtD4ɇoULrv.?'ϷKwGZP\t)[ cuIʲtζ('$ B7A=fqŷRU8C;v `5zSRYDkT)%l*^\;aP\D襤wimbdz9Dl0qޮ ̬W}˟–1#9()76L`HQয়YU*ih kƙ!Ibۡ6L3RR4?'D8imO9>4CXbn}EYGc!l*0t,+L \dÙ:[z 1<:!MY )n[uߨ_zUu>h(*6ybӺ^lrЫkg` Sd18UUjtJH((W+]2}]Jzl6M#gQϫ2K#d]0*XR;7(nBy|s·BfžП=ݱ,WF { ʲX NGevcbfk&j3Ͻ"= }[MѢXƗd<QюE劏`Ty)siyَ*< j Yq(4}tPa%+x9MAUNETkd`H[r1-DI,Z(G4!=hsfD; 8Ȯmfc>%崪ʢeH{[#oZ)=r4w\=~[(uy^lvjРe~4 3p4-L <џr,ԟuR3;i-Օ~r/_o'Mk~WJ^xz9:^(ǂ2"|:r[C6ϿjI81]q*$ϮTs<͖ީ;nOnFtZ]yU^x>^,h|q?]pV+̦eɅ);3к8Z;G\7 Y; B0ŰQ0i&/DO_^V5WJQ\7wklHկ4q8vѳ ȯ%1>ٰJw!ʇ2ća?ɋW/ӟWg_;DٻWAf`(u$,$l95O'orܷ14 Mbh|z·Wk]>Zcmuk@fofE(|$VdK 렟( vNZ)-E$=z 9W$P3̴)% ¶ŔZ+Sgm<`]MLc1>*H1Wm7;jqgs0ˁƶ~~,q11#Vİ1pJE1Ρ^mrHsNiʘcA>HH1ҩ0QvѪH @VѸtjUDj `:GQ2AzX[8}L*w.@\IEČ4D!`JTpxu P15C!'a 5?^ ܁O$VEiDrl]4 6WM:f6([8S?tMJ= 6%'eldx-<U:z"tK}1"2HrF_<2SpKlda5Aÿ +C@Ļ[уyڻqA;[bk)!S*%uog ~ |)mRz(tl % ĿeQ-&sݛ|WkX ?k_PUuniKr˜Νr? אlXPwn* A*!9TiPvML|)hzf"XȀ9j<ބa>^o Cq?{45qx+^_8KcR! 
oU/R4\rl?o5ҮIk澅VY J+ f*YF٨ ߙŤQ= ;y:Q糪zr{34G({ l9*v_<jB.U9K"a29Bio/OŪjQr-c1sQOUUTۧ=fnc -rsW78Ŝ92[ ΉuX^#U~V^V0#)6gF\s6;˜Cl\2$W~_ö 3hs}axPшpJGІH"Q548fY4Y'"JfDTLiM{&CD$T"jp8%b"XZYh6ȹdxx`|j+6N8Qe\(cs\Ͽ}cN?y[^sec #a!#c hLA1a)k [1o+[{YMzz/rGQ' +v .yW'D>y7Ӱ$Q+1,JVF Ja VN҃uW+M[t L@;9u$-< ZgPJNJhMCP`ZEDŽK*`BXYL^jʈh"h#2Pu5r[r0q?)7eǠo7# ѴosJc?u%;nק%_>N"_h/D-ISB]jIȈ$@9ȑK!ɥ : !zw O p4 I59H> A[mH89J=1-f; Α֊#v`ƥ4W@nVMdqVccjj^8V𤒨R%#u@l+CMpI,!JMABN#xDZIoMh>2tP qi,'d`/E'yZ< 7HFM'n '_=G2yCN<5ZX Z(Dtx=?%.C5<-h8$0tQEm[v}{7pc{^jC'oOR˭7.e&5OmxLS50m0{]QO%*ms(O0ߥo1Bⅱ^Y"\`:P >lv) 2*o1޲o1xwV>{Y]<`_? ߘQҌ~ WVمGlq\Gz,z{=~a3baøW *{pӒ3ʜ!sON ӡ´ԍ K$~2\ꏖ`yď1F{f?M$)lj_WGin3C?h~ٿ/8~\l6ٳ/&tQ+xR@‡bqb5# /S̚?x[sÇ/zoATN)8B\ SP\d v,&q 3Q* 0(~lT>N1R)ҪJ(vYJ¥߲ɬqVWh_ª/Ԅ4\3:2 gϮ\[ʒn745Y?ٰ(c| }9} tXKyrT}sVDav_(&Y=!Y/#G}/'%?ޢSj.6<]\e]^6Nׯ&XӐYX PLBfeڨMTPj?B }\WpG9.KWtǃ=0Rz_3Ή.Lp۾<_ 7%*r+rWuvԡ ס-)vi)Ts h*d 91r#R@K}–C4i+IRA4i 1J_U)۶j.Jίcemdf&c.fM ? LV^ezf-gSX J3aoK4 7MGUS6(nY] d`Y$[D[|J, eMszzKV̢fN}޷7x佚|k^ݓLU$ԞE1/,t{M 0jڽZ$%inA}LP}{^eᎇ=~;歟֤&jR>"yRˆm=`{v ~DDQ~1KB~Ǜue-qGgր!qaɀ7$"()PSVb醦 *hpotr` m)JH@<|]<1.XA"Rpn%L`_c`G–H$jô#t ^[Jrןbb-يɌn|EE`]V+)`:S|O 9CѶ# D@'Q3)0ZFٻ"U2 Dmn3r6tXU;{m{]ϯTg2$MTLy%70: .H/Km~Ңjo,'2g#LL# ehi96#aAqe1RvI0kE]9am'۞'~~"U%x*r|}G3ON(y! GO' ^߽bZj凷HdY(Y[e0zDJaV(Ekقr:jjK"*ҩjF\bEEI`îQMHGȁ2lUAJCa%Z36#a6i[Bmu|.|,e"5;"g`IM/.~=;Lps*u4L\,P+UFyP!Qe^ T%فgٷ hOے9k8.hfܱVk=4Hb}0YK2T!#0xJ J_:̎љGvbML #hJ%EuM&,` YG,yec}،a/NJb،?ՈF{׈qe/e/Vn@X2Y&' Ku]80~j\p3ih%}N4gb%++fI=iN'umkflТV dK{͸d[8^/⽍OE;uJbRފ#:eT0@lz1fܱ>񕏠(1+Rw_1 Q[F? 
Ձ u踆 : :of ?o!#F0GNsZ'_HbwweW:& lZo2'T^72fa̵`׊"??c8켧Ȫ;Xt,XHxaBf 2U%vKZ|7zgoSkB̷ (u Q@1| &+R9@D%c +)\tɢsZK^l\KHJ ^0b`?g:X${p2>thFCw&wMOaؾ|MC%4ulSvn|W%L9ST,.n*lժ}\=P5ˁ6bl'DT cfGcH#(¼}*iJZt.v/$w&C}t&:ySQţ,Pb mik!(mdF&&E$P6%(uEвD𪱜5#gC9/@[0jGQy22O[^x]"'j) RȊ78c^Z/_ z_'R5;#*~IcцY) U1ze]l9iA4=I{?i)ad5*3& xoBЬLJΠ-i K)xgةxbyhHx^n&~oblzFPgтͨ,b-fm켱3G;"TRWNޙ\f,·sΘ!lrG!+5QM$F/ ڋG;XЌCh,7A/'հp>o[}v>ky\65k/K=ZP4Jln % c@kNTFNH $qҌG @4v.殾, s_1L-t6jQF `^z@h\-$=O#4a`(A4ȁN5!hBjƤ $J;^8"YGJKcd/r뜎flXo维,_pkdp PbDA`DmA,!%Q?F@2$o:c~K.z/Әj9[k`UvoOR]krA2UqT{b=a`gw<7*8MQQ#]_$^*')$ @XDu)!YؿfYMS술%s4*OQ&[MIdai.UV~k.)KBѵ.kf 9EFwuO0fOxk`vFQ\Y/ ÖӳSJ88fRR0y<:LY Й>A3g Ij`pk[AN.W U.tg?.WQȔa0_ßjpzE2_%^wͯ:iJek]R=hq3nیc]7ӓnx|~;*76Ն]VE>]-zr=S` YnuϪr2=崐:az4B/'ʠ92o~P-2O{6pvyN/No8߿_# xo~=[? L)a8XGܖ?9kpg|n [LPo0-z̋-yͼ+-vОmH.{??Zjq ڟCk^i]U] /~f1?zCT}}*;OJW!b; n_(>ƽ>ҧhbt}^~Qҏ2C+wY r=Uyk-U1R+* 5S8,t,iceCaЉ>j !Pbedt`Jl!{&;;܌'mw^|ҭK+p>ybXlߖsRbZ&]8uwφA$SvՋ N"tuM"jɹ"Vkƛ[]m:lկr KX!cƱ%,l.Z3*0eczpж$e 5Bb+LY0Qgef "(="1:qnތm-Kq;h}nœy1}n}J%66=Ym癠C:a+7:=F@ρGn:5A*"!;BEQt= zɷ ɥ۝[!VژboLx6Ncu2"}(ՠL$Dc۾!lV4803J M Hbc1b3r6ǐ跫]nC܇_D}ayƇOOV'㝧܅잹{j7iU$!}šPlar}&RP62֤1 IhID JpbpS"-y)IM |$ctJ^IuP̅ j,yXW/@ǣѴ;\^{[/jba_,dzv=˾̋.O=nIB'M;mw(ձ¤#y={F_+)3}W}Y_lb'DtN u_Xܹzv}hὔ314lh1NOEt`-"WW `hlf6`nǫ5 gw]grauU7p WX ŬSݰA3"(i'LՎ`mU1QmI;UTq@Pww`H(/LJRVh}"E Chc۽pQj G)pՠ@ 㝏˔(Ql{y}-..FgjGk FA0 $2'1KȈ ϟC*b݅ $)KJ j4YSbLY8+c%,#6(tmfg6# 6V )lhM$qLRU %JSdBA~SR\k5Rݶ&Ϫ7gR\kJ_eldlL={Y{ &@[՛Ag4 3:cE |@VpAxA2#0gc0Z\?^Y2>ׂF1Pm{ Lce“AiO)z`brҩD'c]fl~~ ۡ@R>ݓ7vo?ZxGH̿gJ|83Q,YanRojQ9 g{ɍWi_!208vIp @6Idݯؒlɶ([mIMdUWzRgXhfL&972VG2x*!BI kNMSHJ0rZĈ'iN$^6Ģ$LZ{-v1ZOu42QǸ84<҂$O!bgr$]qy[~2 `v'pNMk]1>^5D/׃vho-oU`S@q혢mkE_P)ow3Ul'd Yy@VSM~>Ris1jG*F BE*%9wLbe 'SX1z(kuz;/%*P^jEQ\? 
^hɋfݪnPQWuNEMo讜Bg[ca0~=R\ (KB(Q-FnI]Tu*L({j$HE%J sZko P"LDM(ż"(MʦV1HLt "2DDy &Mrw烱n쨰n'{lj$`\._<(UK<k$:HLt(k,-[d`9 *Z*D<2b\pJ'Sh_ڋp3jw'oKQ%ubFژ**F'{*Z(8P')_kBPY&o[o[k_S6 ' ^߿bu%vs]$h $INpc暂< $I; ɊlZU !9G 8e(zP5ZcƠSbTrKPHņC~1RL=e!ee!UUތ(?*#50=oeOAk~Xb[cʣ\:0 /Ą.A4Dz2kJ#g@˧,DΐIj?I< ´Zl#ş$$O=}5?<~Nsp,NxPA?!XBH6"F GS<\rlύ$!\dP IQ8DJx*Sl8۱3JvEBR@?:/C 9&m½穑a׺<DM >*|&)5h^C^ p{-"s3 !'xhQDW޹6Z$F ḥA^TZ(kٺ.]:8ewObbZA^Bգ]֙Www7@jclnf _]C']l,Cc4Sl(CL{gMmNw~6[ayJs_s~a3 $!%6[-aѕܰ R)(9ZY7~^+bQfHBuVRF ^D|œ%M'mOOm_ykxД$"f9,bY``Nq89' U-¼sxw̟&8i%+(vEyVyndJ1D,AKhQV[PqTU:S1!? v8>MH~/jʫ@Ld Y {4asNN$Z5ZcRkcJ6Wcuی@'HB m/:Կ>l w QPrƽ`YP$ԛ*(IBr7(L 5(!򜌫`.c^!90$S8cYLyi!$:G *q1TCZCW%% jx{Nv`qkFW ^Z>'<\*qE!|JEA[nS+ :ƴH!F9"U,z&7uoԽ)9cTs ?Oym:?]$fo3<^_sޣGh~?n"QkZZ2@s p^jj͝~ʾ(kȚKy _?+n> ՚g7Eh_?>@̇Y56]ӳ Czs< .jԹPʯ)fX|X[JAsu Ix2E@\3of|[v>j>&e@zEdCD#; f,20I3S+b}{Н/}ǟ֑ovqdZ_|p8S:! b~mqRօMiB{5݉=N 3AC2vÈ*rݭcdKm {si=ox!͓.'y%aMg&Ƽ0eE*v~pJ?aţieIσ(a}-8_QA6He*yŤL)dv2dy):9 NF^-1E үsY@v|h]@ pR"*Mt;E :W"^4E O&f)ط>pRą'/^pj;RoO1 wP?(J-R\/R gD.œR,k ,H/!0:z.|pRxFLSB4x`Of<0Z{98ׇߓ /߳l\3|]~*?Ӷ gBgz ϧLL+lErbM9r=(#T{o(Ih#|$ І[1oSBPxeR$v]9C"e GfT(MjI,l,6AKr_iMv}f||޽re6oqhwҡO-\zXl#/n|cᓓ]Qy_UNs3R1IrH; $r] ؐ|JoS }U_*!|ş|Zzj5;Pjh˛mEZ%숴Ki]M{8AxnL 7 ]\w ~nx(:a`/AU\J<|;gvQg6k&!"uWݞpuknѻGx Lϯ[i;j3[X5Lfo|ff#w \1w K((,뎇(|{nrŊ=mJ.*m6_.5ZdE*8.VyV@]CVyY{,In4YY"0+8kX[SEF7A6IHj$UݝefU^6].S#[!{3KNp>}~ZZQ8@1.T.B-'cb_vv/-r$@G\fڵƚ|h(z]{Rbc!HM.iZ`\%T֓F 4 KpT 81;~`8tL[Őӳ_?@7o}wf;9m@޺'t3r.-٪Po3O>/JH!̥jLgű2fOʎI!eBR9B5'pX N^q.oWJNPxN_cQN橬RNFlPU&465x- e.ձ/VƠ]AbE+FfLxI?Inzd$m}WK%YbzS& .x$Z?[hwPzzط7^x)h7NlX9gb$mN *:{4A(J!xr9D P̚aCzKKSa#y HՃSqyop!Qv :m/1mWKS6ǃ㤐jݸţ粙W&ShNy*%E z P 3bȱR ,c(7hqjJ&k4d핲uXRZ܃M~\lWd'bpm:/*3DˆNv׮|Gۂn_~wϷC_z>/ԜBP>4>`,-qd%g[Cŭ"އ@Iqk]љQs6 փhg+i+>њ ZMsL;EFI0X! 
xP"E9VrʈUGW@&%4JΘXllg^Y/'oT9 ycE< B r@WH}IFQ`GR>|-i,:rt&'l Q>$)=& p L`P+ j$.6f{A15Zvgŵ))xˣpպ D&n2\o6'O7wG۸H-K H$U$*9@r:4ȅqYi7!se ]Cc/7|j`Sʷ U)]X3 !lE*B  Z:砝FNLѨé LBvX;[ ^̛cMhQrJן7WկҥKPz P c3C5zxJbpUukEW:PX\I*Ac`{Rr*y|(;W\u^`R HŢNX4-BâRb+Ô!4EP\Xqd1[jШtlYbUl"rBKijPEBu ދK |*d|πU2ܣsek.tbȹ嶗k"+-uhub:9ygF P q+WbrW{}[feoavvrt_k+GG jJ^U%p\ETj^("蚞5mGd >'C.^TZC|![r1͕D&m3nTaiP9MdA8Qе16)iǺK)$b#y˨;I\4ϩKꗵ/5ӣ1דY[G"ocrVĒLX/c?1 Ӑhߋ_yayTѿO/MM@,WY;[gz~ EpV40x`~.F',h|T! x&k[3]ZM72MopoY[gדKɬ|#T˧i S'w-ыǧ?!mQdFGid?5_ЌC --&w} y]>cvVW ד.~n<2g=Zk +6,OGdڢ=[nty~V?ܞ{qM ؔeOa6|cɑnœnF$ct6kH%G+cAaЌ\񦪈$G8t0P,6-v[}]VT-d|z -njͯ8ݻN: 2;jH-1: Ē5*0Y=}YWcz#,xN 6x10b^SbS%Jɟult,JF LVA[`|p j/ei:oб}YO:@4l"%5p ]4bI^%3Sr$(ˀ(rq9XcNkr4ƓqNLsMSԪuvk袢Phd8^ټƢ'(Ļ >FLӠ[l!(< ogUbpuϗsnv{NlYz*4Ae%ײLţ<ЁpmPnB+2|\zw)/&Ղ |І "=ōl*vÍ{;lXȮ[d䫋X&mvh]M?ϥ_z0iyȽ7Żl6u{Pܿπ%mc@nXӺaǣa$[ 5xQ뮹;] R9- ՍroTIlvN@]$ QV*&?uJ,UUj beO@6s !L%kju0.FU-( ^b j&40T`a Z4tjxvg*1V5.hn#<0A8(o<.4j+h" s(>(6A#;^Aw mirOh ,tۑ<!oDڻkKQK LFйYH2)+G|wh'  (/qcʏTE=ᩭ_P 13%F QE2c xgRܠ5FhrI`ϪK%b.R41X\~`t:L3BQiwmI_v~? $E#AG5E8"UHeSH$rZ=UA31 |܏7kq9$(o%p]UZ>iON/=7:C)C.qܥO3Z%pF"xr;Y =|ic%A| k`C( PKZbM rK7#FQNPa'V??6ؼzA묎pᗙ{OSBxߡp mtڋՋ܂5ւ][|"2`8wIR}7gZ3eEYG5޽{I7'*HjL[9狮_g4#=~fϪɰj=0VdU[ɰu<9J{)U?6dzuJڋp3" M3Fz!R:v WE&Ѳ:7.on|n}\d⭾f5b\=EU "zLZk55FD+sv@R7@E Qo IJQ/z,Qۚ\8ᑽxp}+V׭wsqp&Br.9*MgW٥d%Cv+|[mmnt\!*o'W$ߨlhdh}yL4Lb$9o;KpE]/Nx#:]sp8S:  0ِ?vjMsL2)K6kSO[紨Yfa.x stc7h7n tAq ׻ xJ(̓AYyT! 
wc>"}waJ?bg)knYQ$JX8: ξX}(#BBP8A3RJ*y g!5E$w"<̗9T Wn X* 8] k1:a&R6 ^}lu%QasH%) ť'x0sCR8!A{7\ /$ϵ* Q̓+c.j1qzP5~1YϢhRNCTTqG.!HFby*MJQƞXhY,aSF,',En&Lş5]?5/6LkFqEd.jӐ"$"At\`CIeԹI2deETgeJ%Q3Dݎ J)] tFl?PPuڼ0j; v3F/Κd 0@,6s&jάN\Qh |rF:@ny?~.Gq:#'7oUXnpp~QVjB}Q/#/&<{mqdf'ڙ3eby1gtDŽq3ˍH(aj!{ռFi38GV9Ψ*аhE+TCd;/~V~lIrx̰^y]"67l*l+SzPw p:q<<?{݋oWBDjι.:¸B?jg5vQ; ,%mP+M-$'@ \+/yvF,{re`j""emˆ:g6jIjI0jfKa´-2F>lɉ{)f:N|.N6N$MG_qg (QJyx蔓N[[=QiGzżh1(,yj8|@OfMoTI:0k?0k_' s.c Ԛ{ 81Ym#c(HP!RÅA*[]*J@ArYHK^lNf iR^89_(tZl:["LlMOr~K851/yPuWtQm Ǐ"ߥہL3\ yyGP\v& tFqd܄J(YəQ$fFu[s!q([M\h<rHJDX둡(9K w# Ȳ'#Y̙"!5"'`B`I3hNt䳖٢Bͅ?/m4X3+,=Ҵe 3kv !"!{M(Brx=v,P>|)?-p3ӿsL(Q!2yPJGǂ"@TDbV;=ӓ{麸U^k(5rk&G9`ҨD2Tղ"qPŏ^tQPxU=z?a xc|K 4)(.#" MH1<*Pѣtt u:}0ȾG 3"6R, >h")Ո:j .( H6:a,LZyc)Tb/wF#kpAؤn,|tm_+HxCBԷ)Է S/Ƿwͱ0>JzgHH e` @PI y8yWb:~X$TmbqL &fu\,Da2ȅƅbC՝*-sYΖhzH&V^dϏlˍݮC y(:@$AFzZnӼu5kk5S<{}tr1pvd,#| Rw~1[[nv@6i|p ۇ AȺ$Y;tnvV-2b(\$Wn]&^3vTF6xu{Wι&'ԃ% mHAJU#Z gnX۩:UHfU.ˏoϾݫ~{_^}3<{[֥zK {\Zwx\k1;A;skg@Bo~?F4TWnףl̚t&.Mf~6L8E׫J3sY CF6mcvoaiyUclHăt|J޸NZF"hA+]eʑvJ㫫4 ˾Zєў`O}^h̜ϙ ihoНIg3oM$ ߊx+ Zϲ,RSӠԾ͐S[ I=9OR!:+F_.L%.%oqP̭+,L2 CdAQk)0H0h*]eC@IHEXt;2C֝='ޅ|29|SfrщlK=Yn,]>,z_'Hg`c~rCTMcUjmky`Ǔx7^{[zM1/vYWeIZ?^$K+'RweO[{tיMσbM s8m~^oJ~ˢ'&ZVա7N[ vӫ,[A̾MkS8௣B8}Or4Ub M4[$80+PXaR8`QIt$ 0#AO/jHv f% IbbՔ>7W'䇙LPL -BbNpm/;A cSe꠾$?]'"Ą$L%1eh>] gd.,0#Lq .jGp^B,qKɎk)᜴*iч H &=وv̂$fF2XGխawCP1sP8fJ`%2+FJ!ΑOeqު u˃Os *(28KMɥ":l4!(ssNeMocGdc4 h3(p ۺ\/>r෷ 8'ԥ[ܬj4]k[?֭7ő/> F-f<ڜwoft>Nr?~̽a.3Lw:ŝ]BeOU,׈40N%uL>kJJFpCp p;R?[5FgˢGRÎ<;j; 2~>/}sr5+ g\t&V5O@n5!!:+qi(EeN[&fhc)CZ;Vo_~ e]3‘Q_"y9* U#8d|9jW&w\%[Z)mD] ֢# b&9!{6I+W_Tl~n6^s?$]2@jLJ) sJr#HXA)u\`UާEYu'tyvۉApe5mڦ%M /gu߶YqΥ0\<3!@eL1 Y$Pz`x>`dJ`➆J $뭃hK$RusrS  #F&FgJLz9"a " ! 
ɐ#p —n{0ٵ;qOϳpfJ7_ 0wOϜnw=}ƟԼL.=wp951Mϟˏ&th~櫯kM2m@Z"BO`x|s1)FJ_ϊ/ʆ&p觟Gߩ\^;8b$bi_h&WG2xVMu_DZYF7])݇VM<ԛ{y?ڜX0-cWS"aMHok/ԝki,nrH-:uG~b;Norj_vXIo"[r#rޝ##6+GtE]α_t yMqdN7?H/Dm4ELt?7۴n> [zgfsntw[G->qT,lx@Vu'6q?]L 0iw"j˅~q;u[VD~œ[ΜDOOϑGGE2w3ie I r.j2+6c" I0ۓի~iWP3NN Ap%&<@ m.YxBQ8.g Znsֵ(plus77^t^׮/]5GfXu7Xf3Ӳ`M+Sm֭RNoS>6 x<VKmw9m#"7 dtxOH)c fAV7IɥUd۔ľcphM"GwV86z~4׭֝=V>" 0o?]Ueȁ+/H4[r S)yb9c ˎW.$+`, ې`xCĀPQrU0 nCM&]b$I`"xWDzZmj+38|<% W׷;\8=NLt97H_r]MT޽\XT6Z(V`41 RkcxiGaJT܈5Q1f*W"bzHhRĜ3)j\I j#cQWmRkdb! < vQnavw4,jbǟ/̈ ƈZh/#hŔEF6e2٣p.0rDu"Kz EH{P`ShZF:Hڎ mV2VFjٍ~<+XP8Ee=j vs/` HCgSb`4,ZgYL 1$rI= "šXА&HE_8dyee<֝x؉NicAjPDQ#I,Tb5jNȃa@Neٝm `HIImֆh$3>@\D' g 4H#2v Zٍ] cZ{WǑB=rwWUd '}Ip0̈"ue;ӳKrIqKr(RʂHgzټ侸8$Fx[wW!7m HfS?X%o c-)1X䀋g⡝ﹾ ldžMJEL !({,;~|ՏD~,'v}㾱G9~ټ샲rմCjbóyͻpͣg1I7qEՑ3Q\ (},9Bn-S;6OE5w m,dN:x3LL ԫ6c=;+?ש-뭆aoo>[qOD1#lqg 89j&U=Y r 7-ǓKw>c}K}X>a34OeQN1Zgũ@-tԍRulSN!|97oK]&&&)]ϜzwDh/rCS]v/ԓ9#ԱrsЯ'oON?L 7mY~_9=`o<-2e JvP[l=P4&Pz"J (ՕAW_#@j&`HC"q)*5WanT>r n66DfR䊤$P͜ 9e|qTً<8q}b7ˍj7.?QVqYJ |6 Ql8DK(6O)Yӏ^>,ao_Co7wrx8iϷik0Ns^Q:7ۃy"}hG) tUOUs80$IahF7\uԌ'˭ڤ/e'LKgWx9])¥"['jT<PQ+:ȓT IaNlOY٩ㄙϥmnG?PzNogj13/T3oȮx&O>y)[s};"UV-YU E˱IV*s߅{0a͍|BݟNeMqM=4vi䭦Nz֭Nv֯^ ˭s>?&_=έOٷ۷{?ŻoQJt;o]y?,OOo>dWss7.;=^>;gFGvOx OrwJ0yNɲ M D!6J6UM\u_fla1afI ;F?_d{`Z32-Q1-Q/:mi}8;:ՏU<)A@c!~N#k'>w=1̟1X6O>JSϵ>]d26$M߯xy|I D_s0MJ)ߍx$w5,Vo޿o1Ƭj+"Yp~E?_ 1-9束Fo,_|({CtǍ y} 4l0}pf)d YI=Y~Oo#}19> I}(A~b|_@-^VF!ԦN1fdؠn].tA`|, ]l|adA^~-Z_dT _>4(#o/bbeog}Xևkgߟ0k#[{%%j1 :csbEyueljxa^sK>],ay }w׆>(./ys^گY|<:>-32ej:bt&,E~~WgaS+rqxx͟ndO& ?~DΕl-3o ^[G*0nȪȘtgG\Zy{\薉8|@$]uf b >)h7#ߪJ]LǞ%NduҬfR?Y=hqce}׬]ֿ_.k%o?m ht9۹iQgn%٩781yG&Q `"5[u%`RL}%UBr]c&R(6#F\L\4…Essؘ0vOuChhusg?6 7Z i+o]):r>}I.—Z%Z9.IZ(IIujUbEpX˄b}_ c5I}C)JYUZN1*늳ã5z&QJnE7P&RPa"&0YShDP,yrh.h0Fں{xtp &uz8z;6Y-mj ^!E>%Vw Ĺ.Vk9iQS55ks"8*RI`UU ׂN(btsaZz7[c%G8:Aҏkc1 -8emi)ՐRPU%CB}"jK566P*_E܉lpn[&K ]V3I|7k.ڽ;g'q'sUQOYsdXk#xh7Hcc.)t49ЋYuD!r9X.h 7 G`i4P PޘQ{pcYBm.F(ZNR;0] | ,ٕ;e,w%"F,,58v-@o0H/ʀGdMlc:L?Ye A`?A.}{vPDlhZX(ga'T|;]5,j )bl9,`k.;`bY{Y;A52 
'̈Y)6fmپC_ 8+my+OFRnSj.ebRDaYnJ()ж(oZN &єT ,IjkC5\91Inѻi?QS{+T/!^r}![#S{LuFV&n-V0IHogQQ#(`S+Mv>/n׈SځEA>8&Oyvwdn!d NJd #f9T-H8!n8OCnhiObg,)q/DOH99kFijk9\/#S0dn zk^ Q7'CU*Pc!ɨVsx7XkЅ,#[jy$(s2RHeYITf9+eNw$h,@+gtisa>!N i1xkQ Z\8'g2\\x{a%k Nz;1WvkѦTV*3!+U1SFSٕr! |15ᵗ3v]c[aBv޲A*^'lv8 7~q4-rU:<r8|<*q 2}Aذ%e@z"/EgV%kUH#z]xj0ST=䖷QJ pѽ֘nC^Ƀ ??AaZs&\xԞLgzQ_R֯b#T w>g{`ދ`T k\ \es>5} w,]TJivUPeuue;/{-gU+rZi"5 V\yi H–n] pWL,ó2Lw?LdʎDXs֚aC[P۵7zc5GYm%&yyOy0!룷K@0?> h|:Yy]i@h*ujMڿ?C#ʕQL(1,X9b౩Ʀ[DrT.Wd2-fHja '()m#T<[qD+(ft-n@0%HY/Fg2ְX1ڰio|u>t 6 PS?ڃRfzr+WAA[k~;5>;٢ |dJTF7\pQz^w>N_EK)j@Pڿ݇@\BL[~'+[5?J6q0 Uey-QM2'^NP,hs7JGz6ñ`.Znn7~_9yzSsaJP#Z G[bgӋ>Kk7u_*7 ì'_i?a]KSjt_FZy?f,f,G eRh2ކOU]}Qe%O[kV0IH$р&3Xzk[e"tIuPA<4t] ^j" %`Qת'VETҎŰ ]jK-[xpo cS/ICoE$˂g6ZB=ե#yr)}YpT jѵQ= ޴ļ/*0Je"M= :`پ}ar,dRu GA:KYm}>mp:6FwSZ=&ԻXjm$kkldXsz,mj0߽֜4PJ[L= aw HV drvz4ZvգO+0L朶DŽ1~>62 3:ѦRܦ\>T񱡨׷ALVH.U6b(HT;6O4TlZ` 6rϻd)G|B;cBlT@ɽ8IyߦG:6*2ALl*6g(禊P~ yF-rh걢,zf9T-CEKXwCB CO]o'oUփЊ'D0o sk6{&'eqaG3XHnơC QB:6un( Cʣ X׶Pr v*UƼɲp+jĆN\N\aED D͍v0r8qӳgju!g +Dn{Ɇ ?^!0+`y9O?%w{$vIF{}@bṒxj 4v˗kpu]-Gq+TCv?yU6.`X %`j# *eAI! +ܴ&"݃4B+@tfR=PaoxY?N1A+[_5I N \F6䑠Y#F"Gz_CN:OѓI?=/+ebN_S笪cN wy%`[CV9KHͫ-<+cE1 .̖nE0]O*(ar`'ĚbZeUas Rk_^y{6_,!T'X%$"J[,%5pNӜhW5vA)vz$V_~$I#0k*$@vNR90=VӼ)&2hNOZ.!$N~]7{*{%oTF܄{s =7w7\p:j{8%2(~_̽P(|9vxzr.1[[= xIcY/ ա ^,kriڒ|ޏ&Gd} Xkld;=NqЄ)>LUF0Zg/%u@Ey=j {v-kaz6Yyu;?ʇУ3+fIfeIFbNb p \zdH2h'0ǁ15[f-O8yK]/ h_#Y47.Piwx+X??ZaQY= pָq<OXٷLߍca_,8$I^kWM#k62181#\pF诿~@wB`te}5GYOw_]<(н/E6ʿf[ܤyϽWMk~Hur1e 5LuUeWFs8Fn_>MLEzRV9*݉a7 fĆ %f'3/~Lu7G-ϣ)@I`w~2R[E|%'“dnJ%^h9x/ *‹<:v##^ÍHRoZ;=fH6^ɗi͞3J Acjq>o5(8°$x[{m?>$,+sL$>Rs휤֧韶߿yB2j9VA i8S8/RnE?!!ß%<RT\9rB^17w_oOyq@/7O$OY.xoE7UTB(B!+ # VU('4)T.y &<&PЂt^wF=O@QBw?9Ƽ&]aaLn@q,p$Y؇En k1QSu|7jIlԼE%1n;TVvȊ84/,Wm3-CAأs VMټ%z#HV V ((c5P2ymb2@JO>.:>X $ UD 2@b栳O`Y~6l+7$STʆ. OXW,y9<8?تj20DiI-P Ei 7&N֣O {?P8sόTB1]v_GrEk|h: !^>eykgI! 
6$] Jt$zK[9_AL1 0 #&!᠀wɄDF<ϟSZy)`d۸EX>ߗi$.qka )nQO's}2}}E@ʑeaV)j@!h;n!nQDY jEcq1J0ɯNwJs)ph1R~(P(T BF6HsUwr.8Y>(?كQvOhwd,Vyc%Җ]&L"bH5X(2E*SEn#HMi>sك`YQu~%0%ݩIhzV #Fʐ0p! Tk0PuK]ߴ<G(>Z g,!{Zc=T6)!7,-KOo_;~-*mV.X GpO Zfe;8[j *ιH7yUa0RG/xd'u~ m㴫"J 'zA "6ܽP (E؜RNv/U??b,O=,J2 =?_i@)"Y}+@cqN V0DaP hlSv Jہ$%Қ)?!܇qq(U ?ax*ZuzX7 4ecjYuI5,P#\OyH׸ ֥HRc#j.oB8Pd՞t wi9hƭyx4^qE`nxtYsn+k%Nl3n.W9 )nxtPb5:;agF &jle*rΈF0r8g`C *yT|ң*wi6oH34/-q] :jKI`޶tK@e= 3!HqYw~MDو&mD 1a, `DK"i=8қ?RsAe$tϵ:O;k0bҚS hR6~aaV^X@P!es\*gKt1Y/oY ZsLEfyZf+'v|~ kOw-(.~ޣF;sX_,'иRpIu(1Tj6 ( FjbH@ -rVsQBKq4ACT J a[TWCN/Zxȯ#i9JҊ!**p"D,(逹툙n`@09|;M r ZRLt?9׹Gtr$"|}deypGH['i k C'!PBDlt/QSwmY8"VzC])+ 7ty\=eB<\_'ܠb* ҹ#}^}9ڬ)X^>TOۓr7X 6 Y-@>"$<\nkVO'V-enEC3,tt~ B|Uū(ƻX0mstt뼇UvDk6qtMcE7e#;̥f3F/F#b餬pi .G8eFV!+eCJdz9 h wI %4XHX HX6e [UUͻ$5Y[,dڒ(y˳`t؍/B%EՋXaaݤTG/KJ 'mU,Վ!/"J+)O]֜ޗG4"H>M5Oy=X%J`'st'|Nﮗ>9'2*HRn hCvTTx^ h^h.- <1@ZRe4f(s| ɼYQRHD,{Q nGzAP#Y_f-" Y'Xfvc2-d;?CR"pGن [=U]]uW;XtB|hS>_$QovgAR+>{g8d=N#ojdkP䫮I΅b^F LuC̾3K4oʐyos:E+8Ze Ӄ_ HʮJThxG\>s瞍ӛL &_ZコC^m5Ɠ]#Wۖvmdt$__x8M.ɨUA'-w0i>cbѤmڃaF9n!HIY$+ lObpcQ15NT1A9/>҄BJ@"}M@ 1b*13Z}xqP KIOR &l!;Vx6ha2xGiM+A,U s1UKBũp8mmU}7QfѿƠr`Ϝ̬ T"P}UhIaA+Am,#Kv7__#Wѓ3}?FihXwBߝR>]tmLqq~-j61xQ:ָrp"Ҡ(xz FhQC4.y0}6 'cR>A nCD_ځ`w.n"|5D/AdE ӞSAz^ n}xOxlԾ@!P@,#DR F27P":Xco S`NIƈvZ>C!ݹ)^XD[o7 hL_Y=\Hʗ} 8u_];w \}:N}]uVl IIO$0PKq9:}Cs9H!d3*>ͧ5L< `9w2R+Phv)6}PTl4=m L0x8IoiKR+t.W'1VD[",^y{?) U) w WInu#vVG{%\zOiR$)HnnuCb++jdA`e[K7XҦBLoC$a&E;J3WVL^ky'5]ow)[Z7s eaB"Lr^ؐb>9I֞]G]b#S1>˛i\JAM ]"c_,%meM5&nRP=4 lo\5g"uBcO,qԶt߲o_@rHiXSdo&e9S1,`*6L(ԣ:;xeRp{SNf_]r\< X~T ~}d>ٗg1ɳrh0@Ҷ0s&$Qc)fi ="O>|;v$e|ov22*F )1ٗ/qOyp1ydxuޑ. 
>k}jή?[;,6u Z‡2$ootStͷnN冏PmB$hC48O4\Vp hn!JEDo.q,Ždu=e6'J}Vb:.6t: B~ڤץ1^wcY鱫8ۏuavyX儹mObGoٸyA\R+r$:\½Mdk/!ɬ5tʱ1YX7t ;o0^ 4E-0r'dP* |$0:`=kj}`ԓ(#,R"PZs?uɄ'Y1/aŷS΢s)\Ɠ(Ƿ8 rz7sdgp^e+q%0}+r ) L@5(2.f1AϏ |JA:;q.]l 7կV":mG{״RF"eJJo˩Z~vy"h,5AMU"P0J7VXQ}W߃^Vlǿdw{/5N~fWQRV)b^u5v.pYSh- WܲTfxU1(ͬ+-l a{x;TmEfpeW#3XQ F){N<ҞԃO'p@Pw{Ļ6[E^nCCd̅κ j&5ꃜ)3z#\4!pَ1NZbE§a֠ݥ{1Π3HN#pq-i˦ s:H㤋BY%̶0&uYWYf`w\{iU r !5p\wq\ #w0ֈF# nLy~d<|qO g}UeBeECf$֧|1Y?N9%GH3ybezI<_<=r{ݛ} ՝ȌPWaFVuMOIftνĶ yǟG)/^.jH/1Ȅdϳԥ8eHk*NUg5($y v$(rT.$CԷ˔ )P]c^dRhw_S\C/ュ!"/-8?*Pe*F`KJb1ܻH#kC'Ã%(A6T^pnuXYMs gE*Ў1! յ4<3(|K>1X8 BV&=@]: `|ށGTM>4GY2RM .GQOP\QlONٗ b ΉuizK84<+\uӯfޗӰͫ 5W3ժ4%9#y#;QLVpd_p$bs# 1Em6悅V p31pH6Gu?ЗK m:D:)W 3r`p=ML ֤k63KTPJUP"P,Rr-%1ŹcIJS! $?><px=P#|jx~2ҲE yozA$X: u\]`X"5,̆W1%HV& HkG"am0FRLD$tӛSC8-Q΍FQ' V=lԂwy!$6$t$<|Sw00Ǿ^|=T 罵 du PZRŹ Pő@m}5zK=v\"@[훩uBaR#80?l$bTgLqO-RNxn,y *F\2⠄ZfRYl21J0 q.]l!4&!Ƭg K"kڶ;PXꃜ)ƨAɺ8aTGKi 6Xmp_r9>IoL,[3U-NX-lAݷ6xoFX6(ΒQg={U&TjѰHkFi ]>?zm]<ն^WODߣ~t`h[=ijc߃I~j+*tn(+!K rYk~lzi9"q!3LueKKl(|H,P6JۼvjGŠp"[nv,M#GsEUo`[_j-] 0}Y[l^.jLS a݄R͏Fl[׵njgژ<D\%Il 4)Id %2aYJ~v&xp|up-q'JH,1db=\)TUe&&lsiuRW P&½+e39])Pxc f8!D•sOO:QQQ[:twXIFSb̅<0MQ,ڑ!ņHes.xV[ 4:Ÿsw5޾xp6 Ú:%-k#/@Tb͕*\-6:87 ,0\`rVܦ6cm6Ga/| XlJQ\~7+ULK TQh3##0>BHNJS8C<Œ$bTe9ŒQ=])y \XxmôXg֞wRW0oc%;Tɐ2 }&ΘD&QľiFi2Llb9jƌa LϽtѸb s%3$0g'L2Bc,Z|&ަƪ bƠ7aڷc]Ρc4pXiU0=-,#r1J1tl6KJ&`\f_T4m9U_޽8MWV 9\IΙ=*? 
Z!]+M֒[i >h_V׹-KZ^?15tI$0 Z)v+m#IEVyN`},0/EIID%m†/232Ό#{t"Z0c_F X$Qy>Mʹ.(mG@8^Ey֝t?%1?yI> !hkx;OՄ4hoMC ,+,ӎzځ)pv$&RΎS hBƒZ06PFc1r#˸#f&+8=h">@ax](A|FWnÍR.FMؑbа ^`mGAukаQa6,6u;""՘\f} *:I'1DCs\N6Y]44Cxh>Nu*WN:Qphi>i~)$VjHNF椱':|m 5!X~|$bM2zKlGwo!tLiY'ZhmK_VQub ٬qm}w/!VK{ JZև@5I@/=b77QD.KaG XS6 ;r*1A&C{tb9WsA# Lξ&msbK=%it}< 7f^-뜴Hˢx̪:cQYʺ~_N]͢~lJ]v\A63FVR xo1V]h^O.m5q$N#3[EqƯz㏨vpիU`GI=-?Bp swhebq g<dd\N$Yڧ/Xk񅵰ֵ@B`Wc]7T I54Rs֔*f16e֖*=v=il1iU,uEdf_+ l;yݪzPpzhGwx N A2RC}L+dIY<9Ƃy1hd7faD|pu4 O0h3O4' :,:t:<ooBC-Ώ$2bH'y:1U2Osibghzx_AXATew!͘g)[d;f,fmkxְ`490N]ʓ.hrvN^ D3\ >tL..aY\eτA0lPͱKwb YeKBmEETmJj$ y-%ۿPRz;ݝ512Ϥ21X&j誕E+9٨įw 肶s.JGMw[_%dZkv?O As9x~Ȉ9/}?G*=ȹgRONƎM_WgBY}h"/ĸ5ϊ1H1H:OM_lm$>^4aUi N+GZc 2&1ΰSf2f0EDD#5UNerS Q`ȑ5Pq%$(&LHk Ap'i}Iht']O[K$9{X(9Ig0+ /IxMZ4I̠Reȅ|Jڐ6X6iaʑ_ً1 ?豂W,)<9J! sz(ѯv|#&$}ogB? 댢)Vُf_qLh{>~&Nrҍ扳/NSn.XoTtSHW|ƉKJVx`_vDԘSzzXO8{.' dZ BD=_ (ҐSLrC K+JGNZRUlM|~]u9}qPCN^٬C-2Aт9{cZ0 ۋoR_ |a8[ko T[ B^ЍpzF\B:OcҒZ02l,n.>ٽ*D ҦIL_lɋ{.zt"z&4=Z'A->>] C QfAr7ʑfN=kXp'WU߬֞!COu ׻|zuÎA>E&{Ԛ7U$*x]o҉F!Э9?NhQ$At>3+T \I^?hn:Cq7uvY] b }hy,A~/9nd>61'j噭BOh(_5/\ )tsٜTޖRF#CYKd$3)ӵj]7䕠v88t>dǾo}BнӳqO˛L׸C1W#YIQ̗Ib>sWK`Pۘ͵nN`ͺu }GnԚpjsD!.ZaA^!9ŽpI. 
eZ0mF Icw^s:,b{Z %)nLC^Cam >֕cU"3Vq)`<`=X-MLiIm$; ҝD@ҏ# 52Ѱ`z@]co3悦@҆8fg]7G]`{ u>=p4{5_ѴWR-m߻z-v9rNkff)5]"K o;e_$EgS-|ѤK* -C?a\^)0 5Ьe^x2`d- J ɲ cn&jҿLKH=)K~(3|ƘfCqcU[n!PwC7Pw($ߜՃ'rwuqJrk."e=NxfRIr "sfE').ֱd_[^f>rMΩC;7]k9X;R)< G,5w:cv'F66 6a{\7%у,>Nmګ;y=6>st-yr7%r3mN AącZ%Mzcu2\~; fR^}&>0&o3y{ B}t|A}Tx;2U^ĤDD=x53?:mIQpFs|I뀷sɗ{3dw~I֐K2tH|Sv0J"g2kH )ji0X<|dAlΚ@O8']0PJszT[=SXcCFg YzgʈScqcȷ!#.G1[S3ItɬHa";`> e5Я7fL,f R++'o33H')3jd4œq@»Z0F :W-k35=.X[{a,z#5n\M,1@>CجX A0-+ECBL%Ovi^Y*=4Ufyݙ#Q%#>i`LF8ޱn׬qkB>x_%P=~($y<wq9t \U,2Od ^c;-KrpXѨG9Հ!KVOE>ErdbA{xֺQ*j̞SqwG }UBI)bbqexlA<Z&8#ẆRc+ǣ plc'cG&gyj +:sˊTth9GzɑHGIXܭ ~dNؘ5)hW[rlzCCY)<7\tq?hZEcc:a)7TX@[- Fnn;q ̇kvOp4ƒy=&kM&A3sN{7CfΫTHLvV` fh7fߔ$;| SvX{UlĻ6e?Cڭ7}(/UUݟ f@ R *΍0X'hq3ݨ}X*8Coyvjdq3oFv3[kH?dӁs"m+]3BjzQ}^yq+rǿC̝e&TCxw/HƚtDLC?JġAτһz +WQTUhJ cWb49e gf?hx:b -7'ΰ> _}S1P+%$$Û_SpPΓwshA_ha?˿e6Aiq,4gW+g{Da'>4U>Χmzxa0=8pon9[<]~f.=9l6~ Gѹ '={\,t YaHT>db`>3v 7 O.S JPH`/2#{6ޣzmY<~‹ZX HW.*?Jn؆ɫZ+mehyyVY!z)&P 6m`EaԪD9}sacpAɴu9x!$hjWY )6JevCr>a#)r̩kĖq>ΫP,)v8%dPy$)Z$)4s`6u*v,C$0g#΁qf M}P< ̽:+o?8TRCb.5J")l <-9L{Mm2K(LVLABXA'(9"LqQ;(-Sb*z.~i5ˎ7~pQnNybb0c`R\&P 4EB7]CS (|+.K\GCTH8X_L? 
d3墶+M)f5֧hCdjk$J:H'mf N'7S`=@+ nMb6?1EV%$(fv>B7)5N&smeA^b趍~'nŔUQ+I01Wr65UPTbSLFHV _ PzF-.l&VO[4\\!2B0Jt1B$frYRS]u@0Utɥ ;fxq\Dx1$Yۿ/|rl6O~8?٬R SBʼnrGD],iN7^kT.T"ȍ8ZoA[2d'P 8&[25_-?<8QYzN?"ѝ jڥ |vAGQo|[:-t^.QJ_[P3X})zR| 5*竴ll\PcM*1'51{ <[,2,C/2,rAMl&lrvsמD-uJ7#3ppnZ*m>* KZ%NJ-5  CDOZQ\J=Jૡθ=~M odHNQ'1g`6I0VE&\rR1Y<{5L&2`np+X9P豩QU GoP&bѶ(ɍq lFBdRSkr>IkM=drҥ̽_oBaå[?G*ؙC#o9L˺ u*<7 aCݧ JIb%%ީϩ O(d xoI7G p=u.1W1Ƽ q߹w1i=}+{pLÃy<럛^*q/:?T|l]|~M7}fru!tq WN?p)|;%ɶx[Bvdћ͢.v~9yQp+6j5)B,i/oLGfߵQFۆ(J$j;m8^yV%Ɠ#.5qq\yyw|{sk^Ze!TZLwLv){EΠgޗy,=Vt@Y1گZ}ľeD{-mސ~}i[E}ʌo&(7hQX3R˻;Qo;x$t(.x1NnF)A/ GU mFcV@&(<8N#|JqF *zqA8g _e=BTZ[;ΰ=G+G7 ^ Ci|xv/nV^ mz\@Jح >6jw6.b2= s{U;]{6.cLO-Ƥ_צtq|WeOSġ ^Hp(ùmzaXofG"Y,Z f[#ȑgb/{h`Y({lrgͷC,2qW`yAOm/9WdV6.|}7ީ6BT8xZyb|>ƻk~@K@4izkuƵ{޶^͘`ӥ?«ݣzy[#&s4|Jc; 6Pga8G}-Y;/u&Ēi߫C.-tdZ"e{T]D0dKa޾[fR165bRT,sj 6&Lv.S?hR_{CnM_kC>;Dvk7o>jUl‹C _O) qx>Ny[ ؞lk8[.2z~€[p)QFʬXr*(Z8sm=E*zU,лaYs:<2t|:<o!Q'Ӎ95 aDM*8eSR3bĎ=7AO1bf\b@UfRʪMIPU)i$*IxD].Tr6ϟN^H8Wкc XK1)DR"0l޵q,"c.?$y8@ !0+iK\.w]Βȱ%LuuUץ>GmjQ6] ۀ+X>$f:yS`_9dXrXT D.Yl e݅|lPNZtb^ks~$aC͐u.@\Wesm #8X X`RA5,숈f#d l)(dY&Ĕւp0Np94?0 xl25 @26:ecL)'nggk@,xNT߼:[q5X;UQy=q b Aw7& b7˦#'6ƇE<c/(=@xXJ3FKn"e*@m-8k5;} Ԅ鍞9`Fӫx:VduN; .|x{#wTIo_gwM/aI6Ӡ8>85O) J y~0OKԓK:cp(WҠ.A]> a3\R0-+4.½ŬMhZ2@]I%jNXfdU7P Q Lhp{lSdsRgjP9Z2e?v[N@>j@{"4qUC2dqWnKlɊւqcP+( m#ASUIsv%[$GEU@392 cmC5߮468/"h'&l,l\M 7K#(~E@voܭY)ncS= _r{2!T*g } /) 1/c+2 2.z"?Ps8ͮ$q͓Y|R i8_+މsѬD&AH-x=ɯCv`35NǛhp¸ْ `~Lc9*Í=4ꮽ@ vyKGj; W:_ r{uo3+pqPHEp[]*\ mM#GNEڑ%GCGgM~r3#n3 @>r̒ӑnz~nQ9tZ?[#a*+GǞo\kZ/u#?sIn9qSԧ :F KJn-d)զ*y jԖQPrN%Ց@/ Yi:zϯ=Tx>oDw 2W3J`QwvIYtEgM$R*0Tك[ث%1Q 0RgrDWk/ǜ()rew A86_5${Vf$W<1Dl4R+&8Cڵ Ahf:w[6D>5TQqHi%DPN5q.R-D.RRVaαLdUOnaC䧔7ͨbf[-9~_0} `xrXj&Z3O!P%MrQGilݎ RMbl"5Ƒ cB.F`gg;J@qϷ5pcgUpp>M^5GW?@nweuְS]/m0j ۀ7;k>{8#HE{t־Aqnyuɣadh0+v^3SCZ딂9f9#uM mS%c'}ۯ;d7Xb,jج5U>C%kcr,|K*Ae6r|IWrI4Sz ܗڽc2t#RWqJRRA>BrؠP -ї5h?k/5 ^́4B3+v=RFL݌:Zl7̶\I ]ʘ%Gl2Չk`hpSo_q4[L>B:'83X&aTd!`?$k Jŵ%Q}%J4*3 8=bI$,Zov Kef A&do$h%&Pc nFAƝ_.8[3V\૗My}<雹ћu\J~,[/R W' &h Hqu'Ҙ @^e[lmNY<#m_9Ԭ O{lRH -gxPZ 
WsYh{ɱNRlam ΨĤC;ߥ>S!)8W3BZA>*G7~e]Z}2rawhޔvT?}NaCk'_w3{J/q'"!U[6i`G^;"_LkZqV-Svk'-vy(5 S|/~w[43H-:$'v2en Y2H`?% //c?韗W\z翸` w9ַu^/ t]?⤝ clo?_M/N{/3hcy}y7su\.߬)+W~\׀Ǟsg XJt,PB*-H ZFy ./m"LJTkݚ4ŜW<%1ΰ~ی yp-oeGM,Y;%ŮR`2g.%є FuEWHaWςܢVHw?\YzX{J#/jcfYb=M_`/ OFxj`2LR֒XZcD Qj߀she O ~m؍ Vh.^ӄa9t0s 1- A dh}nh=ŧO.pvtٳgvO9ph@;,RaR!6sp{@mo--;vXưVbamy Zv 0I? ;|M ^`k_p ~q;ZȦk'Hg Zxjڿ,X@Kq8HŅۖi %yA[kw7&Ÿ|j|Jڛ V|~ m uiP;S-zcઑ&#ܚ &c O@1d[9FP,降ޫFPJ/^sd̢jbo}ݳ]/k<ҫdcHߖ0>[9ʮ r{Ѽ܇DF%yvVAzDKd]~Vϯ9Ӈ ^.2u"8I&Scoᚲ'dOjTfJDV=e͍^r6qKX +Z5-E`v)7V6J v&y=qWў'"m9pH@za?76$k}xJU%i<5JMu=;Y:c*Ps$P5I5gy4 1oZJ^!܃ɿב@*4܊P6 K| o7RbM%=ta' h iuJ\Ԃi  J%US`xt1,kawIsV,ed䑈QS@@ 2btT|Zt@ߣMcX#Tg`|n_9mWSZ_ˑ?ӯGߔ,Ex"gwkۜ)jR͐l]m6 nDC )tSj\f!W;#:m)Zl p dSӯo[r0͔*6a g}+iI":~2)Sm&*śkЍq xl^ WS&V Ӈ!3xF2LXty8"X^0C _nCh:l@{pzp=V^4K~LY8hxZ9cwao} G5S<$cQVk&'}RKJMUvA1HC#¨(k%a4d;q Zu`h0p(7S؏B"1^gud#ʗTU:U>XB5I0?@^܉u5Ò&A"J0d]-8ꪹZ-=z`EB% sIv N:$RQjȉԴ۪nH:: 9} (95.ȲQ4g 5x m6rPԄI{ױ66JsJ 4C80.)4{ 0=#Ll ϞuڥВCc#B.е[(KNRB>߭{Dvrjo GP9nW@)|)dċ4l|r.cLUSUHQ-yD5U'gcI;bI>x}/&kOI8 O 5<`0 Bk;9~-*BE>~čJLM!W)<Z".!Yq}"6 N6\9+4B!(17[9ʤ!iPM?fut\9ҡod+o({6] rjN${+G Yz_pB ߬x2%ڪJbLJ.C*hzVXP[h}![k(x]dDbqy4?=jՆ$%;ƌn"AHrr:V$ |F5mAN!3Y O Si|n Rvp&&0%!e¾Qr;A&: .Fլ`)+J54D\归bEq"Ԝ-)\tեB>[:2\rgq/]؄\3;\9=;O g@i/r9梅aa$5NBJg>?_O 8%88e]eXsU!5gp͆M6P-&"U_uK+2OhjR4gzk̓6\8%On8RUzɸXOvD6vbá7z%91ghͨ*Þ1cI? 
4,}>sw SӃT3!ɡŤf)*.dK29(/پȩ;˨t<'Z=RѥUNe=z}sJ/˪MK֕[ޥ/w& ͙2 !Vݦ*II{{5*ST]!Q\IZg<#Pc!5~;&wԋGT>Y A<TT96ŵfD-s.zr'z' ؙ=H Ru1Mtv!(/Iqh)+$j,BG`{hw?<L @[K6T,:0Og-x]`rݷsbw03c5$9BϣlyY>\v]ޯ Wk+ 1QlB(ΠRÝ»<^h&kl jR=bT'>e5^<8W0PXjVPǏ52PrQ?sRD҆:V 98[k xR pdT^5%lvN ~5ٳ<)(!<7:W;tu9t^lzN/J }uiQKxin荾ݒMd"ƒn($:>D=~yX.{lO./#u5td]:l_A4>fd@.EE7Zj2dIz=D¤Xேo{Vi8;^ kn91ڴ=WSbUėfhIL{~i{ak/` T7 9"C}%eqOz3T7Ĝ;˂=ƉP1)"axH7mAwؼ%̖P#0&z<4#%n91g3&cf(:܆?p]Ynoo^N8mr>e/ޙ;22KSvv(%rv+JNw3lX&Gޡ.#JOK=Gp+R0 g[0 :r22xՋt)oKQyՋUPX PW*xO񦞕eſ>)fO;sw2Sjc~4sF;_.ߦX&BG1*QWgd`kP{;_g/0}G9\v졭}F;ָG._+ǯw\U-Zu/DF%K|z~`" |&[_?sկWf~ ]dz>7Zi: 󀾔/4HSW펦ߚj>rA^ABG÷=rpJN֙0{G,Z|VXm)'֣!frT.,n9vV9q [ݤ}S Ρ.N7fȎPI.X-n^sq=d [Y9Pxky< [ݤ}#cd#;]F 'l0@@m1}ͨdaţfJ'4'HW–RkU*v. Iz].tҾ`@R]F 6@ )heo nnI` ͕ML5'TvRu-F㘝7|=^&aCqO?ygUY-qHU'zQ NV-|~e [ wkuq6 Ղ /^.ݨ.~;D2Fa3`Ln^pHb&w`4Ϛ_4ʹb#UZ5z3!5־EWlIAȻW70#=HmW0aLQiYh37QGUm^m 뿾~pnipcu$tm+ BC,)˦Dqd̽+ٻƍdWY܋GkAdfs$ \$0/KG&e(IYnݧHӜG#x.t[sԅhF(/g[E(|Ͼ7O:.E&p4tsmWX| ׽L.V?d .|`&<U¡Wʈ0(wh4].Fʼ !T- )}BFb9.xEBꂥBW,TMe&7gl[4,2@l.N!LmDsCA&QqcE'*v.Z(zfЬx>o&I4aZ5PTg=PKO)2;Zϙh`iՈA :!!:'e$qA` y-j4 3QYKu;='(aɓO!I!+}KjJ1b!bQ]228jܩF PDPXȝE2brĹoYovRW/>)f^I L-^p`rO=Dt.>zg9c7'JhltЏ$ՊrS"-F1DEr)-w& 6iP` va) A! 
j gC|R΅4ִ9vMIԳEcnC5Uz.{&7)uaVS{J%!fw[eꐁ<)L`J6JCo'h &r/HxJ-p3 EE +/Gb6{εΔGEق (e~*4ũZk^[oҰ+9.Q+8jRugV0JlTTh8qv<E -ZӕWlI[ۛp\AY:n?0Uz: J :,creQ$I`'k.*G: EP&3kL$c脒#[> JRR}2ea43@k\x4u2g*ƍdB_yt5Һyc(C5-a cT|*Ѻ-1XL9t38/iOLF>T-#ZҒpH{:=7{5;(F0/a8bMS{@@ ˲īޫԩ6τ:EAsSJR?XH) =aZԽ`0խ'lΩ<F TCmñĻq,NK$v?I?uŨ DR%j7٬.^$hVHZGd4mh%]s6C/(Oczſ$ӕq?)9BSDEYT$%'7h0밧Emnʧ6^5KN+6t ZtT|ɇd2'󬥓}ӅY /nl*jSח>g.,JoUEW'M*dUM^Q͚R֫@<)u!޳γ7&_g5C{;r6nJnBzЊ5s]AC(pm2*khq?tAvCDj#R(F; h9BêOIXnՋerZJruN 㫳Wm:(_*Zi8jPgCƸTqqq 5=8}WjpŨ|7cS>]v/@k75cPwB5xibZQ$>|쵒os)w!.PYV $m}Rh»dV_mjy' W&oWQ[h}ѻ"0Z[Gz~qH3M w"8ly^]L&_%LkʪhHnMr4eR Ζs}'c_6ɲ,2㢘Vt.]흚C(7{5tA2kQ4F϶[Wġ)ݐL5G݇‹ܧHEHA~+]ۤ*W?u*>(wa'AA+e8t2`pv:$ (35޸6)TM{=j}Wsvr_+^Da>j#~꛷?oގ\~d<\ŧav0iJI6*vQj.S-W %jAHJ[2:ޥJj&ΩvVLg)Pr7t9*p:A$TЁ[si鈩\.թCxH{2Y¢}/zڌ 0[o9~+i>N"7g$Q i!ZKBGogYy!Qjc1p h'4^H 3+9yCL ,-a.cKǵ2fq9FkYaU'1T=z/Ge4*NRkƕK?TbzߋU?˛/Y3I[xS_t}2Uz WhOg{c$jqhۢg>-y2oL7h;?oBYX rܽe{!/|n?(/2Q[8硾`3QM j.k`D׮ Tp\;B)ZsK0}嗯R^a?~{tBDvL>LSFt(3'iR8&ڜ0?$NZ#17/:ґB\lb]i߿׷?]9'!gw՛7:_uyxya N_/|ξYJEtHNda *B*,]5q :2HVT&8=K4!~|;w>.lFQ%_^5il)ƽ_2|ltd u L-߽jj}uWZ4S;vu?wEa]ڔ䨮>_xCme6't2Y÷)4ݕ<}jaJf@ M\:Y$&[3r=ؘ'ۣƛӏ"-[g@Ŭg<9HP tt7ӕ}찱j]nk;l5_np=g]f#CLIԡt#충< *OC1[a:Z'u bD2-!%֩ٛv+H8--"3-oflPhNbc(x5_2ٚw<uf*wpQq sr ur8l!T8Vd"[QZ>Z$A}'98MB%W`)':͘&\ d<;rn[٨\C U ?]BvG-^{ܾCnvz:Cn M2\G62˽3sIZDv׀>7cf㏡9чh~rl?*ֻkkr`'`•;U{ǹ0B H9ksM`v3w+Ym H ][lg3@wGTmw3}0bmz֯E/ūc`jw'5dw.n$äҸ:h |h»$;*%_;dcgu%,~!HAWsm-?c1Y{3,-%j&:JL- $) Tʂyf%GZș@§K^NmJUڍݤ;CW'+y^` N))Ӹj| 뽋?<2F DٻFn,W,yh] 3 &0"iN$[*VIUAi[%y<y~sAق7o/`8a8o>Ywf:~׌xAŸ=7LQXۃow)T2A.'Y?FD-c*f"A1gFJa|w֜/}iI5͸8+ma4 ;ﻟL\L^ȧ8?x3x}t~aipڷ̺L.O绹 '/R߱L^\ ft6S'K7VL)ͮ_>)ecpD;K̥x:i蒕㩖$EL Dv+s,k%VmbK%hZY4]Ff(e1:+!??Hտ.#Djaf! 
6g[8Ѓ o5" Z-[+}4Lj(- ie#NQ$zǢsVʷcv!ip?+JC^œTLRL;'ݓL'i*kaw%o8o?LFJTacwSB5lsC%߀J}B>p_iNWⶄkZyfG 7sNзCoB~0ߖY<8pw4KQ Փp2[67Wjxg NYgԲU@'+NDlP1 KK`Hv8-zӪc7e e@L҈JZQ!Q3}>/WDQ;Ƹ?Unէ]7܇+7%_ɹpn4?-糚da88cLk&Ȇ|xr0M՗{sZz7yw>b \??\xbqͫ4:\ĺp.َl\dvx6\"v6_|53]<`[3ͦ }6o~pv lR@T:-%`= .ߛ6_OXe+Ͻ6G`2[y.uń~Sx" E]IXb:ů Mȕ.G?_꤁LDb%Ҙ!urE$=,dqMoBxƛjX.6)O%SV:<!F)QD l/(';EIߩ/o,f[鮑 %՞qQb1q/91R+VA@g#kJ'АPۣaC+N{(DL+c$K0/2! 0"(`,S )DR^p"<%déJbe눓re!`GUxwkV~ # A;| !eW[DJMXc|GT|hׅp+hפBUW:r }uE*pȱ vgP)$a!3T >*fOg;,Q,5{aafŠʸ)C</CG[f/Ay$:٫H$TW>7‚{-ΈjnQ T̅OPjh%ʴ?)p1*TJH>HJE 5$,!ԘZZ&9WzO%M_n:n`_f sl]PXAB (u%fC,rS IsT#j]s3`?gĨ<})c[t*`onB8p 0!Ǚtx<ҹFoQ8/[P7J׆5B]p_SMƃEx?00y).},|1#~%:8/K$)N4#DF/+D-| xZ{vO^i4@22D 8F<#ӌIc2$ʨf!'VZżɾsklQA#6'|DQlJ)V D1(.RESK-!Vh'fSKҴM=}{h^!ge]L9TOl-p\|Y_~:+.ak뷋끻3,?Sn?$6$do22 <:kYe\e"3 r)7ͩM &j59O@`DucRJrR_usֈUG`ch86nIY e6#VVcLJ [l_?)VHÄ@jb1:ya2&'-4f*JcHzqn-_Se$Ѽ掵}+)h}\~/IN(b+j,vwZIQHi!)>>i rϝSU+_l5&Ӑf:% 4AgUA"Lc&sDRɻL]g5n4$e%uZ.obaDivԪ*Ș_sJ1ԥD 6\zUB-c*ʨtز1GnCUt6,+N5y@ד:ON~_Qz2sg)hJt]{8n(Dm"Xm_ӣbԙ4M^Rb֢SaZmN7ՇIBNaLjR\ՙ>mCW >q/ Kϳp%c+2\+ʭFqzr{Ȼ/<1 [|͢( & dh\A;K6SB9_Sx{i3v=Z@G%A\@ obnIdJN ǔS:kCK@62W5g/hqt<*xkŢxRTɓh{^EEB]""ƝWEz-Q2֒k &,YrAc- tr;`ϙ:\kUS9>Kk$"{&\OBU 1_E\iiX28 7Mh䦉Jinԓ&Mh_pzS*қYk8YhN|UG{ͧ(v^"i@n=:TOuB;I; ^hDp$%RBP ihʺq#PJ"gM.ǂ^hDI$ͯ'߄vhD4-,#$3xfVˊl}j:=akJjq , dϱf)4j3Θ,3"Eg\mH֑a¶ʚXG ~,#N:c̻pd×3J\蘭]'MӜ4Mr@NB;I;vIvERdF]9{JS96@+˩*,pQ0b%\ iuh(Bcb=7MVAGR!.}\%f1Ql׻k4(K.;㏙}Jrx&( 7onaRS! kAT.yX[ -ĄZRznCr(_ d | ؏fzݹ7e\KskYgIF&tump8z\/mMh~ڶ߄7նUd-I95,&ԛBa>M\X)lxh$HdBFf7MhdvTUXoW; (N=/HxDLKs4RhH}Qt0Oμ9D`W?OENtYz8+],L?UBY1g5X] Ҳѫq!Yx 7nj翼ށa m"`_n sWRs~2u8R&e+xUB?LgR3!Zb#n;_ ':PLQM֟mP+OʣPgDl>K]K^kݲ4miUjKTgumقLMZs61?{6#ν'J~ȇ޶gqEER[v%;mO%?dǮeKvܬ"eJf8!93s[ݾ5ݐ5@C r)A-o0C-(2!㑎E*ם+! 
M!"QqH e,R> D(IN[>VgEq\Pee2J39:'AXryAΞ o3闳b5%''f۷ Zl KB7_= uDq߀n8fBV18/}Ii>@\i:ߝ>J_$d,pXq>l||^9UwoNO:iE?BMa ~;}ڻWZ[֞|z?::7g^~돻s^:_yL}hP9gӠ._ډf4o 3.i`vӝ0y(:~dsE|wx,}z50u* NY|<;O=0VipGv:Yq't]z~ktz~f0:;{>L~cu?/П0Ohcf`ⱠQ ѳgi4wOuzYAը({ơn Uq^97&K3 ƣni؟?ҕ?Wysɿ J)+  EQzO׾G"~7P>K8V톃W^RaЛS(KL%U^hp]N8qx+R,QYҝLH`F22Z,tlGoIMz02t/fsm10!x̦v|fH w:Q˅<K]~c|@B[dA;8 CИ-yg;O~ɴ NZF{qϨYA'QJaB󱷓(ķD!% -Q9Jm݈ PHQvb!#z>5Z: ԅ9)\b ĚIHKBZRE PT &z^o\IY“|Hb%W<~K>/UJ|W!uD 4"A~ᓧϞ:腼1Ŷ$Q(o۝iӺ3۷5%q0oMSmډ`gkY;鈲xvrjAy$1-R(ՂCk'/r/(ڞC #;>T,1Z:1D&;Un*V=PQж-:CFWt0"[#Un*ZyMP%;/5Q< Y}Av̓Cu칇$Wm5pڃt.^+qȐuuykDTyqR7(?f@WR+qYҡEZs&}[^TYy\կcqY\Ogz.,YEwt?U<׎tGв  73MVnb̴jKcUjiW:W4_ͪ{I6R{dս)>4ɐڣ껷!E;{+4hjYAw a/nİ eƥB<;bXJ-ow..9EY2FG_Ѣn쟣"npl0ġw;$ԕ E%!.G>#6qSH8/b)BoRJmAe]K9ͷg_ ;ƅdK;Ϯ) L%:ɲ34QB4rcc4>0=?;=tzz|l-O]UaKhl,ŜJx0i0cCpq1I:ǀ,abt =*#C^&J#XBD"XES6hAX'1v g^!XF&`&hFE~\e^*aNjZK>3Xv&{E*ЭUuV BCBxgGRLHw)aD*-3&a0 co4^zLjYL:cQ gwqPF,bEFiX1/ Ԃ;J Y0M[gCi\Dax54 f!ԜѰ4iDyExkWYC8AORy9ҊccABZ%Q| p`%Hlbc!=!#NsKQm07qIYϓ$ӆQ?IZQ#Ƚrn={?yR7">^|[^.Z"`¯(AE"8=Wժ ,CZ0L)4üde)3oO O=a*;LxzG9ErN+H0G I>Vꇣ4$E2M_'R f|B|.G! h-*%iH^TݾL )׋:9TMbbPEc|{1*&$o.[GӃN1 _y81}CI Z'ن{'QJsh)+l<]kvq)<A!,ADFDos֦PΩVM7a?G`a٦Y(Fݵ֍8avo(>Ve@/(SBgB+Vyɜ=zd @*+ D.RN0gMO0Їpe.RO`kofUe ŵx l,̡rI' 0u؃ ɞCAyŖfƚC;t§6Y'1sDͽY5ԊC쒮cO 8Dek&)~g͠pq%*ur^j4#kl:GGIk|s,"Z,Ou:hN U!EWDo?&W5OJŘ{k.E0q4eWd̒D hY5|1N0jĖT J)"x&ԧ͋'vR]CsSS(ҘU+m @=R,LI2gcz݈ܢy~>2[*F)Oe cӭ5 cɘ҄|yͨh@^AH:'t83]dԷ@Y FË_p8 Cdt!חnkh6yLW!wOQ&'Qw_.Hٓ9^4N忆!uq2a (z," `VE$2,S^,"uER8doo6aa/ըAz?El*ߞس?KӾ5!(AW(Lꃱ&e0/ ˲_2"&4Qkm*fR&IĄWPnue+jYu_ $[oz7nnBtϲs+USYf\j]Z0+/C;\*f@^%ȫ5 98p;d;᭬{zS]oFW]{/!Eq I_4~$%Qd-)(r93kiU!";O>7/&:+p'[ߎMS.c Ln7 sy=_6}v3f YYkR39| Ǭ_(n][Ƞf$E=w\Qαt+ct#9@lG3+##(z.y`Ȫ [zj`|1wFm +X*ޟ\ <]pi"w%~X("nٗcޥ3x1UB&tc˕BRKBVB(ŷ*JtǺخc&hJ%]|- 8=76kc`'}W̥UᴲԨ āϺOcBh%ЩcYȄԧ1e!CI+d, E E5G"C-/#B]NbeNhd]%$~Vf)ah몿;vv֠oxDRB;9cA7Kl96%\0df8iDZ5\,W+rM~̃.ln? 
;;x3R 1..#ac*c8Qt!Up_0HOI 1i$pQFyT1c^Yy()G ][gWUibac EGQĠHE<3mE&,#_:?JzO׍H֧(KC}sRcDTXD4+Fp@!Q(h 86pZQ7!ƜzZahM#XK։4R#rdOu$;pmK ïӉo+d%lWA:ζi;,ۙmͤۗm_^/ȋ 7$iW rӻRƗgR lq։dgAxsyb/A+>3a?ufKl8-h Ԛ`hdl]9 zrkQMԄS\A 5RuLЎ4&tzox"e %`BBB$n5̭4g`9+,F7鞢V8Y~18e&0g"ӹ\SIB__6 JՏΠL{ϓl2(h2f^mFOK%m19 K.=ukE`A;m{vHiM}Ϝس.;1݉n>][&܀|ݳ.V`Pzձ)ՋoDюPi [FpfSd{rz~5¢LraA&y!8V,ڗy>]^}ɟN1׷= }6o)t ËpĜح :Ԭ_ 7&MYkoY!. ByFYߏIsIt2[!ʚ Ny~b=kJjҠ)D}.Zo5= ^}3w`^ ;37Ätgev^k_kiڅoWrEPYsdz/?'{7&x?}y5t$Ww`'lPiF7ѴS\Bf6m< Fz Ov8Ф Fi9)L1ghR) pw7g0>uЋ7 ?$ε8^7KM5&5o`3eK9˧O; z}K_ΟiP(J<XOq(D\czpΓ{k/0_z d{IyNg}(w><\/(V/FBXB'c1rך}lX|G9='|n>'{'TCFT!biyJs1&8c+z>@>6'qu򱝱ަKEױ.JIv@7WqlQw>C wJa 3:.X_,H!w@Gd4f?Glmyv~ҴQ{c5^},ucT-q!f"@ה1u0C*O$eL8 |7#"#0aQq& o>1"~&`zR$Wlhʌ%xvԿ3|䓡p}@v׵1%@yyHtZi`8n:,څXFi_|( ^vld`JG~@V@awسZ&0E=+. o?ݰ:0עk`SREVػZY!Zs6Ooc,<|x0 b_fX/V;4{cCP5#k&햱vt\0rn8?.f=IQVJ 1D f6,4Y6̂fA n7zBL ATTʭ ܸ(2]py[);YۊM00Z~bQKva\|V*Ki Wk?EUh)m ,׎뫹Li_V(zڗjVz@$ibGZkE*P-zIOUޓzJTgc+3M_V _Z =۟9=vRً4YY5"8?DJ,_O酪Xb6L^LgL5RbYv6&6a2#y67-iBeqD)a6kҐV)u;{5x/&D+Rn)>DWɾy۫GUcTpD*^gp n)?۾O$aĞǪ /։I7]j= *ࡱ:m.#Xu,!)[4PՒlܒ& _,h5WM{f1پ&UaTݭbaAٙ6h@۽zThej cOk~c5ifXS#i5Iq$tW#,ux0;-ʭåd58>FB$5Ctrƛu.Q^+DRS~tZ+f/ 83curWqhe\NM% uڶh j`O&D ݝa .aiTq$K).ZЏhrmO'ГQv.c켶A'/{',͆?][uޓ~xYJ Y| < b&tMSk'#˘_ >b!2)!XJ3*@K3#!>ޘ4\>02ZU"j.~Kxk>]P\߫(Sh[͹#9WEZosG}x,wK rG3ϕ()U;? n3`rƓfP@*ifr RH,*1Mdx@r|%'b?!T|lVs5n$+kp;J5ZaH_ MM6"LS8 K+XsyY1R_٤MUX/ZXWv #Q[W^U0ac0!17>%4Q "Qh(#Ɗ)Vat0PC׊mcIr/҃VRֻvΦknS$MB=/ HZAѲez{Z:gF1 RBؐR%0M ؉Vd{ٳj2IrrOѼ{U($rCEJT!aD[06X+QER\3 H8|ʥ#Вkm jAWB~aEX:4=#QPisfKt&n,ψLk)ջKV>/˓59[LX+QM\OokQ S`;\|ls\<~Rj0qU:@' ,IB:B_WO`LL= K~p ilսg4苛n[dHwwEr O:L殞朌XCE EEzOZ7q X;ޡ- UTf4Žxdb!ڪ `lÄ%1ld\&eH"2EhQ7+ݕLj`,_LP\EgE'ZS[*blM\?z׏7rĄ!Pf<6b@R!)~~bF. 
RE՗9y{Ks1rTW tھywIۜ#+2Pz~v/^֔T PJ*kIjRXQj`='`2aP\W&qprl)) U&^nDk AP$"(rǃ4>Hz" C;o 0BF{$@Y*XT@# !Kd\{PM]8{޸Rȶ7ۿ}hTc VSgwj5g)şw'bEYz+_-\tc_şsT%{ ݃NSUM.& rZAk5F$\|muEuط )O~Q[ D@ӎ2o*Ĉ{b5%ym%Bż#Af4/Tp3LbZQG:8L .b1u789_Z/,pϬ(h9r5iY.[Z#  0 i}g-g#z`f FmN I .aXD9$3*N{t:67a!Ah-Vl 8=O͍ q-@~VK-Hr\=H$Q >(ߛ-T!5ǐk}oҁ~Ř+ߵ/w䬴|gXT`8C k;5櫅~4z9S6ېWK|结6MZ[M:I>׶}k!4;T|ɡҗD D46h@# Guj(w`O4M%E|ߡ4w t͍nHS;߆$,E@ѵ\I3Lj$'Q7=w]oz xmǑKk &]JF{s孔%ĎeVtKհ`t/SksYq|^Đ|LƮ-a`;Ѕw`ɧ<4 g叽6~YgПfEâE>fQ<)隗pSwɢy.>ImB{YWs3$hY:j01ᕉE|5#TJ?-tp@'9Ӝ]GztYTgyhr9$ f&>r|V,dY^Fg?z67ogvwv=>˿/o;0&oO./|rz1|W5"gڛz'!\M??_oOxML:t~=?|9-c(~.Y0g^jp_:e>{`1hxnlҹ3.g{?xq'5$&7+'tژUojG>1yL ϯg//=>AU\,:l[iOިC}=|0?,F~\، JϏ>~LQ M ~1I/Fs΍9_Οh0}Xb̠υtOu2́~.gMwPJr`਍]ڟxhƸ| 823^_@J"~*RlaoX_`0W3)./3=dIq+9kl ;:s:g/"8M4&c {3O2^csPc ְ ^ >0 i嶗\$UOqAmbG!sJpMvuĥʚq4sD7d͂/ YFď(M;"sFׯ_ q-<,Pki>`bv4oD*!}_A=5<#?~]O?5ق%4i'IB['b{@M0!~ % K֑Nz@ {WwEQ=GxuDGyD- /QLnJ)lMܨ=N1:ʤb:H8\>0)V!FV$gK h*JggxH<QE"JEQE6 0-8q!D@6q Z2MbV*A!H͙GYoW;%#. XUH1DULpZ:o;&/<>P~5``MMYf\¹ͭ:C0qPo+-ؿיۤʼ<8~to_Cwe.qOIήdW1̍;-r->n%oqj)OmJKtDē96F}5Kɧ{oMh@CY"XVϛ gm|򬳠oYH3'`M^hEw3Xpc^fX^bHU!Q\/FiYUTBy*&dtrZdY9j.)Vpܹ #=тzq[h2se/l祌jjf+ `&aۊgoނYtHEDX\YxMC^ܽ+ye>h-c?r55yAXh.jW3S60r@!4c0\Y ) 4  IRDT5nS8vه ϫJTDq "ګEF$Aj/ ISiN[ᄣ 0$L4^+'*"#VwEuՉI$f%1-oI9V%4z\0&0R9Num$'@d8U㶢!ݑ(oSNZ*X*M&%UcxP"Z+ J0NR kR,}aMd\jq+2 Lx9k+d$F3iaiF8MIg6#$D!x$5VTB XCjAS'9 mkND3H)I;`ER#RTJc>Wya]Oj?kjqmDw P.؋BbG4ݲ6v:7f {ZxUZݽf{XiV pzaE! 
ThLJթ6j'XrbD ZU[> B(0ԁAM.,B2<&3*8݆~ XwcEuyK-̈[Dռ .'($p¼3t\&sHû )Ygr]U 8o}/3 8 V2Yc8/>@JPT1e`h T%7q 5DL nZ0WmՁ,)̹{!陨TR`k@P1#K۠b4ƘsFƃ%>NwҠZʹZv39xw-H8- l=͊d~Ehtw/{q;k̺AZ4=$!b}CD!Ah*p֤DE^(m*ysnz$c0 aC 6KI06qq r']xI͠UF0 Cn6;u0G0 b%_cɛv3=Va5JlʻEId~~)B(JT>UDF mL0 -Q "0 nb&\q (*nj교lN I- 3sR%)MYSxlvIdtMC"R x[ f݊X K46dLU$KT&c޳E^U$Qoi?!V\}:MZzwS-9K1N:,wt/tȿKgI̻M)9+Lm 0>󠱥P2oDכŝufOc~4),CJHJrR;RJ'T7TH8ƻHl6L~cr 8Ԩ}mYɓ;9TOٻFn$W) ?v{&^;מ9:P8Դ%QMR}̄&(xղ|b/_dCH(ġ*9` }o:g/GdcEjoMBea6P$׀D;nz6~zoa26 ]0]^EH0n xB*k*N=8a*F?aqݰNeCQrNlv+iְfk0fJ]>ftQx#èl Q:+ ུR+C\)n4ѹTjI)&@8Ud*RdڷwCޢBkDgM˱ i(@'^s/rpSc=p&Tn`s"S&5H\Ng}g?Dؒuub+DUyk/?N?k> .1w)^MԗI]EL^L] W1іir|逸[I\e`ߙ5]kc\Z$711g&mXV68[e8 77O=ԻMaw 16y[dҥ\lvզ&?]_Mo_'>u{xZZ Pݓ$*){g1b4e_ez:2ϕeq12XAL3Krs9,bV `(+{c JpLQ`w˰XGIhψziZ2mZK;':n0qX!qPVx|&_+!.G'Kw4`pcx*}J/-\k3uh^J2nOzt+m<4mVʹv P:#8@AN:5F@0T3{Qɛ!}-rh!LyG%L{n#N&@z@Ж~"=$J,hEFu$gCoՈd;²|UXlG+!N}f[0# R㰋Z'?N~Xmjp: T''?Tor8U ,*Zٴ= J=K8QU[nжuCۧ+Hf ]L1̈́!NeyӿP5 cY\ە uU].E<Vx}g/Yȑ{PY { i1+ܳy]쪞B1^oASa>v"<͈O"VK AO!qA&SjvhD"w*8^߬ibdRzH{5u}zw?ZGy`!8x\c2n< H:oc1NY!LXi|hryy Kb9rVO{#;*~><ן~JDeK;Jo2:.ޫMc}3J9ၱ ӞsmgC3#LDIC9Nv,Ԭ]H}_>4W/Wy J_Q2X>-Ί&]Tߑ+Z*pcwĸi%T #;F)݂ڛ̓*w?GZ=8qڌiG3?竦))|vc[NϞ|sɟ?!Q o;kzs":H @xm319G0 4dk_ytJu:/t^Y?In5ƣɡ:kԦuqa!2gvl1Í8Uf.zNošOC,6\e7̶Sd@ xզb&V@l5bBngn'oūۭA bni -ˑ|\u[b3HƜYNyȜsC0l'-2JFOJ{3z +EŨ[Mƈ)[6 NO OIh#ŀH6ag6/0\~ ;J?%E!b y (: ӾK5Xq /i/N )ЉDD@XK(FӞ/Ǿp]\udg7[9M"^2Q˰P2:4X aF6aw >%F\ϴߺKd&WsΚ{X2BX&g11 @sc^ .v>fDJ*B!XM9s2:WV+GՠJ6F,r!뛥d&\?P1Ʌ?jLyG|Z !,X9%˧Gq7:┬(%Lv@=רǰUO i{ȶ WCznp;s6!TJެRp`'5GCYr:r*aTEncjSN0+f/2/)nkx2<@OmY[=9wD; )Ǩ!0k)">@uZp\KpKVA*z/qƴR zDy</I;z V U),sf$ QZ …ٞDR- (`b\x2q@rs;!P:T(wˏnKDwLZ 1I#"Gyfql \* mijJs3A$r`@I Xba.2 4vE @l0-);EE,Q,.mDl+2ĝ &Q v(Us' }˜99RkT)\sHcP9Ĭ `>S9@31Bf?CsTK)*XzmZ!r9 # @Z<`>.m#CPE+Qnۏ"(Rwi-7 Q8x=)~A=D } !afɁm"zXJQ[lm3kPRŅVNQpTiYo'eR wwj(H~=Jq>?g_s`“9or'+!7S،ia2mXm,wط~ @˃VRxx/. 
9$v̛_# DC#j/gupH}!]F4ֈձ}%B=v"/1ڶ }{YWmtOT<ѽwO>ij,PɆ1LQ=*!q;30vp:'5!qMu+eW$9U6?}I)FeoyYVŗd3{sShS|kC*ik+i漣B#{={A)F CP=[fS@!1XSoնF ;{7]ƣc~?[,x&&{,Zqk^,-=O.Z)BO/{3 *-~s\$1j BFár.Ye^y1?:EaY55j >׳@Fjj8K NgX_>^!bϸ= &}yu0ru0C zG&.#xn mm[cJwN4Ǫo8ĉ36Z#-'EPY4\fxԈp;}-0?=\;V$:X5Ё{#H3$X~*Tclx%D\ yu HsR9=^6hb/;4<Q#u̔Ov[T99rK휔_ǶIٗURC>3=thh8l8N1ɜӃ]q@T#ik qG#-\]6}#٩GCE E)-ՏNwOai*:{kУV@2D!):r,ńt>Ec{ GW,'^S^LWx3S\xgPQkW&~=ANZ"CN|A`IBhF8r7#0xaVYrDҫǞsq-vMy`zy$E_CTi鯘A{R{JU&}(iȟ\EWʁԜmqo$vW3eEbř"Jqկ$6}H6oo\/zrV(;L<#R~b J6C%]r) *CUThQzx٦m54z"KJw|kS@R&B.C&0dVfYέu䍇QH(t1ؐ&?{۶ /n/ ;-l:~I`VT] )K"9hUAS[92Rd*ZpEn-]?VoɧV݋sDT?:}N^r2_l1ԬNjd1yc׭ۗza0|]RN|FxKcE(ϾqIͅNR7S-qxnv bt)ʤP0r)!XqCIqH@$E10$n e #1CHqEb$ۓB8hݺl(voD6.-.L]זW{j2|+1ջ{Cds]}y0OD&YrkQ6{\3`)85}On`;@Ga7 fm.ƹof27^=o/ʊs4 pnf6 &rp][µjH 8N+qZ!JF tfk1#4B?p~g8?fL}3 O="L b۞PN{2pLeZG|&XTp_wp!$؜X8X>Av{0N=RRU;t#6' S!2"0vOj RSR A6i  BC:@G1&XIx0D Roٯ0(!=HTgKIc} o^Ů$"kW1Bo*_л-c<瀥iј1bL(fh"NBmPBXP+cK9}hC K4 Êr` uY2ĕ %>9r֐[y?m-1x站,_˿ܴ+n4Xhg`.f{ʪEԦ\_Sy/*s]r)Ys.5TQ[O0DpHnN6thˬh}>n,5sْ5Gggn({;4P\9 aҏ8Kr]|Dĩ|N0sOǩ8EivŔ N<\s'L8KiTdm 7Dm~aLҬ)x' JbOj-,f#o4V㦑yЍ7o'7k+eg3Ըlsކ N֩OmV܎ǝ )wFœ UuCXE nx ѕVFTf5!ZEA7.j|0K0l/r`zu(F. `F!l.~H)lSy@ଇV3H*+4B* )V=X4>H"ź(9_٬%˽WopRAI*6cnʣG7#gUEEC cyF457YLiHqAC %f2Ydtӡ,~ѫQWNf)aD؂{+vCYn? R]l+{Bd#岓cr\>2q&JI`}1߈9PPR9 aAh4qMI]J1p{ěQSy5?$͞/f^<?yˣ5޲Uo:M_ypYd~Ƌ2T{nO:Ի>M쁟HGn7Wc=K~eb%='2޻jdʺzK\6L3Yf}R 7eϋ%猌%|SIMNBz2=L^Mɦ-ݜg:Q It, X: '. 
ńғ;\D~ M4Z@d[Go*!W\ug2p}Fb_q_x+UXȟDl)T87@jbQ yHhI(gGO0[h_>%@& ؊Fq+[8B'V+V;Pl6x=ѼM)v1bT L \!B 9YX&cO8aJആ)W'  Di F)L|da o)mW?gƶX?OI$T`ShB.8k`}a([·^S؁2dqsZN6tLyc GzPaf⩠XF,\Y˷ j,/4 bX9F%mu n׿Ӵk$EL4n>^2^uEs4l5:Qh=?E.ҭNa ن䌈6pC'Hr|4{A|t/p/4d cS̒]R_BzE^:@Z%DAa\ )AP nP 7c*v1r?!PHxD2Bt'?iV@B6(AE$W;3hU/Ae1-+\G ;=|2 "}agA l]>{.)%)X^Eɬ,$˽p2^^}y`ڨ~~`hÔx:m.sQ946[csWi>5g{Ҫ6F<)U+x'f2;:26hNh!>w=cٜ3U|~}t߻.$K@} s ZL`R7ZUq5DWZQAerZ+biQø/>D),(IQ $ DC1qRb1H("8r a8"F1!'}]㞾=6c}d,JZ,7̣y܍oŸzm^V!ޕ7q+RUޑ`EIIv\UeRa0g.I\[J=.9sK% 4GHd%6s:ZdBvV\p3T|ۇ>}J/56}O4l|}^wyH[=w` )-Qs[ł]B>of, ;B)*R} )HPގ=|&FxwHjs y @D 2w{ouc!1k'Zib+%_BP*_O$W.Ȯ-g W9?e R1LD `jc|Tr&NZ ">s Ts>_ʸ(46ܳP ^yRLj"R'^XXQ='¡8!0O^_l:WAQYUT3cW7!!_֭֒COe `u;b*ҙuMhQք|"ZGf'qn[7GnuyPDtQƺ3"g۶n -jݚ/\D+˔vRX8!JcUrsDnK4*/;rupSXNwTr"kWY{+9O ]P9,^YlxQTS ,n WrfOwit_ɩf^w(JOkjC$mJhUh~*qaI<aH] <%u*N<++LRH9nؚ4b'wos}HFK/6`>Y;YU@̗w$Tn-6#Iׯlnt&'G"at4F$ 阄ђQ-PQ [)1taXrt- 08;4hd47Vj}=i|O?Q@eJJ8yArN65GxdRH_2ߚD\S>'Ch8xx,%ѳHH w1 g ͥ&[R=V\='Kc8^ ȷ :`BbPf湓Rc6\Q2Fk6:j%́i㥠iGKț8D9dB`T}7G`xk NY+d [18i0PlsDf5  ^{=L921SgWИJM| xf$GghE,A\0MV>1Z+ ^!'Zx(Ĕn.kUJ,s{ $1ڻ:\\-9Ҷ!TRPrj%;*3֬k  kAǶZP'k]-?RmpGԙi&4 NϝP⬀䔀̤Eб}Suj"F\϶ǦQ9)(=SV\׺#J I%?O",WHy @.WhI'ҠcA5|ħXqCG|a3;;O~5=t}ݝpЁ< ,8Ybc"OIi[C3=^~x7ϊw( ##b .aDw`GOCTѺazf5eǪ%SWz",D JW-Bp;B];_|s v7w>V,jl,C>'9#XDC.lLKR Z䒘`<[[sB2X8i KQ q/ʀLd5h̡}P[oTWnz&`шa$r; /`7UVaǻجQl Uzy4iv.W (A'\V1Un0JhyTİEH`z~+,%"혶 a[c1Z1X("onB3h~ ? hfc61nV*n Jf؛Ikȵ oOg|0{w3`yz$l~PLg~[_ev4P 57C4^ߠSq#vDqx~;&lc_LCjDvm:sTbWCB :;%-!#n)4\kHgI85i-=еiHYkP%OCM+R{(9;ulpG !֑'ĖX:>+Ua2wɯ].;(2bn^Pj,HǰEB:891HK"Umxg "`!xcEdsA T|7lt`g7/ϛ$7!0nq }NFEwup0w i{P~/Mϔx~+MYuxǿOOb)`"GyOf5|0:zތٽi^~<9l.UkcNzhp`aM Am c!50/qG6N{> &G4"&xў1v{>;7>$W2 TGE|\njRnMyd ز=Y9ߙI-ϣje$:W6r J=&|YX9y0ʶ*L .;Ծ{R>%uX IŕS!ڊ+QSfi=+B2 AbŞDY*Ę6^81H9i.4)2n8[E#nge"f1XÍdl;^pXs'Ok&θZ47{n iFтS@9m5H|bKa@%m(0;#ףےW,m>$(\hPq 같"˗5viI2+CWV.| )T[L.lF,4 :,Z3d\-_{߂:7ue f!U/{7'3'kA%(:1rR3(i}ģ #GJb8In^»7X AYЙn+oA?w=O!]^Q]fӈ9~ [%BB)HQ,K֫Iȭgpa{AHR߻I%dud#]d$0R.4$Nɧ ڶς&yW* ӟ,zKӁo&!=,kge*? 
G628?S5ΙUG?􇟶͵D+J{^SU 7C[dv(]i]$>~5I?TmtJ%slrz6;wl?z̏+o SRAa|MQan']Spu.@ Gvp( ,?GjYCLGaEd9U+<ӝg68<=~{_w^5>ڻhbF҅ĹMJX`I&$ RiВA$ƤXn^ĜDՑ4YZ].G'-L\Hi$' 1oKy„uOG'1+O*"iA `΁nϪL 9g ΣIL9XԉI&+{et9Hpp^+F5D*8, }7UG÷W{Q#/S%>VM~1~[pc!s TR1j0 [phCNIh#eYp"$!|_swߒFf}%^^ٮu tҜt+"L;nXBOc^T!Sp7N"*d4TcwL(:h,2/(Pӷ2 '1 n sxb].:L_DXU>>_'*(*&8aj9**#8y J I1 ⒩zEԿ'XrueJF`  v@5 :mN=}^d^p%lx*H2xOJq6z1(sE]d8 ?F6/nKrQ] k$~tX%_Mڐ]^rj @`6$S#Y}e?Uﮌ%竈:$e/DwdzWZ ms^rk"ipJ (qYTT|7dP~=xR]l`z~Xr\i̳57E9c>ߍ&7=Jkr޲^͠Z>-sƴ^|?TZ֑XP9$0[2A+'kTYpT2A("| ] RԹVFƄ>NgUO͡`ȕk<"uRJ8x%JÔхtodlk ;j Y 2S2gZRɓr"4BOl*:udֺDo^xwW|G\zIPk.;&פ Bm&Uz}] V.k8ֻK@icaY4pW*HBqhp:@Y̺| sVͅ!e7DΨE)H N;U4&*`j&8Ad%fI#.(k \q~U֣DMwԘO& CL `5P1#5 QqQJLrl8MR*ueM޲SJ3.QyB U(maU{|ײh!=O;7yҘ ڡ "Tڮٖ=il ۻǝ:~-B&OP{@?}W@%7!gh\{FSj[SJo)uKἋX k5] ߨO\ئ9*-kT3ŻAF>?f/ަ)%]y4T vN9/?NK[93;@ R4wz•*4,}kpvAѶv/즖)FHW&y 9 >7U1kv*}苧Gጕ[1j"pX4 Ax~q%%o, +ק{?AHvݛL" ՙ僔\EY_H4D=2}CFBR&\csN@?¹y!u4H~5[1n:shV/&{6۰ކMɾێϙPo@S1-:wZ2mɍ+lP={/sߠ݌>O&Z |B-DA&ݹz4[KѹxL!R]ueD%DW֕Zi#Z}Yѱ>?^h[U9rxoH]֏Q2\=i\:H1ʘ*eļ}gZcc:G |ǻHy0c .O鏂xx1 ـCĂGBR]0Dd^8 }t0DcWR(8qM*kh@4>M?)#R&{7@pR^p( W2eF*$4ثx~ҷ5K AQ, 7\{Kd|v~`kΙΫRIB{BM_oQbυ @LMOJ}Q5qs.?} ,`1VU"o %0lpM+`vC>.f?&Pas^N6iʍbBا[x$ސ4cpJ=.m<IPo.=K=O/vX bɦW P=m(qK%)k0sbϼM%[u,+v%{9\3,)$Z+sFBjK\1#`i:$+?yc`ImTԐқU-\TRT+-+E?HEko.bs'~pL=OIݚ(op-\?aE>%`D*3b^hxWѕ4Rj7wLmKDMYMfߦfA栐܊f2<Dy ,̀$8ۼ5 )A32С}8[>ry[ldOMZ{|Fmh \a tdRp -,H Q݂@Vı:EFPleh$C(f28&ZerlLq_!|.1RT| ! 
V6?W [l~D!Ŗ M0hGcqW~ \ѡ+|H v7᱕)ukyJaŪg/?.6PmH ,ȎX: Qf[%WufO BwM =n=ջOolcɌKZ_"l24%k0>-_VAh)<@7(M)pǃ#T?->C1~ϚaP2<:Ġܤdc{'0C+Ͽ'ɉ7YNʟϳxc%x|sOz^^ÕoܨѳXJQ9drn-ɱJ~9BKmے6%υiwY]qM V W56 A̰;);;z̫(3Eo?8z {x;ؼ=x?:H-DkKv6C3K'& 1~l*fKbb{#MJ"yQۍnJ-ZrʕyRe,h}-g#xbW`Njb[4 XCyKH{4#P Dʩ˴C:sR$U.*;FIJ1v"fy{h(NbZNB׎Gh7pwƯ(ȑ$`59d"ur?oHm3* X1_FG5=` uX}vҘųjӔvG F `-FpLe`qt㝬U1M܏E..("Y\rܪ[J/`{AG7:aJ~J:.G!*sJW_o/ 6/Hr>{;H\ !b@k"DB.ǀBEeP.wI"J^.uh^8)gw]u/-jyw3Wm:].bj"Hߢg~AQ.Hٛs`W%m=5.z'#Z'vدqz(b ť'8wa0Nz{~1'$ b 7mv-f^o v@I6 W0%WJ2/&eY'G?p c <1ι@vɈH3ndOpqkEh0*D_.V27f{TC-HIS.e^g Qp&+2Y(mgAqiK)fc#1@!_$B!"NȑCBFT1Əb}B"7< Xkng?&e?>I,39D$"*[bIzW~b#Y.,lw>#CIZ[8[Ir^+^I+kܕV+rpXEE)wHFA+ 49ך~wz9]{n@3\O =feJy-4s7y-ߕ&" m3QfK*0Vq.'U:  U@qv6|\(JYbf0Z~3-&l2~<φ,zQ`Y2V@9k Zhy`BUYt`oF.u`MTL;nF4[. rx&,q:.R(Q$b!9r#JF y\9r @0ѶxFnLxG%@+r~N$Ӽo&!#+ w1:rITo~3&b Gw(f&5IOݽ&KPTߚۅaA(Iޞ&籂3'H Px8B`=@az43wz xKhb͘Ƹsat:] 1|KdF&;;3}>Xzߚܕ5sXX^xzz;ğOo.<}[j!SuI>/G1xtG1xtP ]pa hPne(g>,wb{# B.θ~c]M&K@jG=P i!|| yo _>{ }rc敖\*5D9^?V A A1(䜙K3˛VQ%|+R4 ,)BATԎD9׋2WlF05PжRfū H*3]N&(;-AK-=s C-},RQ׽cAf w݌wdZy%^]X滌uOSXhg&}jӘ>RT[`&Ǐc&g?yǟ3fM-0x(2G_ z8?{f1?x3y&DKJX C.M9 EwʭP GƣhsVWY-¯~M5+Vzx| xKw>Іծ&͡&,+{ ,'%v>,G%%8sΕX[YYYFľF.qqsĒWBG[,=:8Xd0p4ݨJ~|VQ`M/ULv[ؗqV>DwW dZyN 6Bz# ɸS”Le*b'wZxJVP |-<@Y/Y2) ˓WrUc 7bL(_IOyiH`m5AФY|Ꚑ,7IF |}D7ЛNף<+#EPJx̒|̝ fEJ &JFYKdqpuwpULJ *T{`~cJ* 5 f-*O]N\~qﯾj8-:(Z4nܹ͜=F mt0h'9IU<e8}M2gNuLyX*RlsVҫ4[zY .?#uqWQc?^ ̪Ā<@]=jצwT[QxJwkQUTD󖎂zW.p5cHJ0J5Hq4X3Um&1桿l3Y`@wVYmΥ9p2Sg{\t%&X4R?}f>( mNGpL;^N6a]DxkC V^Hjw r(;Sa-ZqyyU0۰ZoE'd F`tMQkEYN; h%LM/`C#8xlp5S])WSRk"M*CRA)Fk%oɯ;i5[ٵ޹G}7Ƽс7: / _ʴ3kD).lQF鸧0|ժz;D}GNp OZCd8꼷 Ju7E8*Z/:qNő kUF;ZAϫO<ļD!H{Og!ζ?契xi ^9FF.ZpoY@$ho4یZ>,h*,e3(Ƒ1{|D]&q.W3S;}Oz 1Z yoΌ}f7a u;O^T'}]ٵw֮#PDX6 e|5VpHAHxa%͸$4cM0bn9sC>\cY0H{n(SʜCJZY`ɦ&s 91ӨpC0`}nbZ8Z50$B1g S}aB!I L`k(o9Dq`Hr/UԚ K%g˔LY&EX ߬pzR(STN&iT\<C1#R4~o% "j1RkQ$P-͸PiƜkTYbjj+ܔŹ]μ"xĀ,wϢ lrh4zϾfg7~  ?95_.}ϭHcNyel܏48YwLI.UD2w},k csg N̥'{%+1 I.I2%qM&$vKŠQG.X"7<Ъڭ EX@CdALdAL9¹-V(s+tAf*)1YN FjGI'}laO.v/% Rm{U_~>.Es܋+J[ڷAڸm:%VFa 0paU X72؃&nj4o5wWѪ_iTXǥG-&p 
&5erCC썖 HRJIh)\<-z?kVTRFYЖ"*2TfS:Pʒan9L9ԭr;/ZfWr4W+#PpFryA1Vr0M7QT`Q ؖX8Ih$%unG풡]w ){ ?~hfFyc%MkT@UHڼX/0r$樍^چqJN1cibmԆ+Ԅtl_.HC{8ǡto4]7 }vLHhHt,x_Tfl}>rS|)Pb'X#Ɛ4$%"l@Y0FFi6Rs*<>w23)A~g>ĐJ@݌!+SEk kYWJ~$\TZE븠Qلh-~`4s*hMNBo"*p*lj3a ֭1԰ݹ IA"Dh8UY*Ř8|% /fFzϼ L-cs:o\WJw˷[26Ԉ\69b+Eƙ/SfqtVzo| +ÇPfGMxpHBw3F 9gA>E/>l-W54{vXLJ4H{V[ec˖=q=j]\Si)N y0=w3]I?u=Wꛞ,%=ÃxѮ*Gmok07]Dzp.w*2)(:A[aPn0%HjzTK:4[\SCGՇ݆i'&ja+8}/~6z}{  7!ĴƤdz^_.g0 Tk>/ F4Qd!%'dH!B8(ϙ0C.uDw6ٮ#3*L)Mq.+X*7@cA>m N(QI(9Boy͠w'u:o *qBPuåTR3d-v3^'crc \**hFH&5 5+m!=ԗ _ RXF'Ixo'o$`҂s€v p> O>#,}o$%^`H N0Ժs#%4I3oHhVkgBspAke%thUC,,;voSw{\-U"b"cBjQp dUh_X[~f"Q En,"\?릠VZaFnDkGpi۴*l.y"G<O)ݥ\Sk;)BuЋ5zB6Iؗ-u^}FG\ i!p}ޅpmS>|˺i^ʥ>ukAi&.`֭eMօpmS%/TdA߇:`,/HU@J h6"|,h6hA7غî\l:عC׃e}[-JNM̀)ny>%=Y.ؐ`Z F89^<_K.N~\pB5 .?nH3N O `/&/l[س= h!}h6yZ-uMd ~-Ǖ v8zKj}y LבZt!a>,HxYv4@/O.aGN{i{Lh{&x]>ɂR_0bkʅK9/VHi8Ȣ1C7!~EZ@ ﰟvu Zq-][ Ⱆː ^>^KwWv-{)/.o?\>J ]R 2nK dj$&ިƽb1I \A)sXFcU{GrSV$mSDd$<)u›GzRjGj#EԔ)5xlA4?תz-"͗_?%gs/c LqW?.Ng1-UR*|̮*|, ; 7.\ Ak) ~5\4uV `h,~[۳-ּպڷ}۳eSQTbe9.Bh9V<`z6GܺPǿDUe)μMU* .y<p9G5g$,vRa#'\ 9#u7y q*]HTʳ}XSZ=鞷p@CqR~X,sX,ϝ뎖{)?p:PH2*ú:NHb@iYbϮޞbv^ܡ S򿜵{4,Rk1қ73Ȕ~uC]G S$Oݝ\w@;\nAR?N-+1bȥxG}ݻ_w",-m vӺDBdw29M9Ę}tqxTn\=^قQc}3'r9x>VtJ$@1yzML8)tށ[\^IZ>$QrQQ#e0F71sqq\ A${"ե"qDM»v xW8t'} @0Up<{Zw>|;i]1ΒH'kSZ*: #4$%4Z&6(}FЕlM=ū/q%um֥E{ q}j$8%DS Ӱ>5/\_BK)7pua@wO݊)j)r' ]w׷M;Tti2Ig%R })xC\)U_ %^n/яB ׾ŭVpARN*/(^VYYR[s$q,pJVNY|<͚c2 Ь̺i ]m~Wv:tV/gv; ݈ɢ7wwH&d{lkDB FgqRTj}M,ɂިH5|(:OYTBnUǫy]g6I|dmiߛΞ'ɆPk&)uzN3/.wPBӓ\īEG▧KdL@__*u_'J%c+gLY7" BO$ε;llmo^&Q._o7Z]~5A;pZ m#I+,ۂ<tU.[sv)IkIԒ`'}g߃'SP33_HG9O  [iDqvA mj}J"8g-wE8ɂs ()eD%3EuRĪs Qk$W]B=X j囼VePPk`NGDv 1oL;kYLUz|!W1>H#\q(pz޽ K[7* x{<8n'9b% $a.y5$/;r0g~W+`Ϋtq(g.FSd%48.j~AΪR&uSW?!9ЄcHs!&$ h 57+0&@D W@@A ka!^0v[ 89L K(+3L kJj}spp/NxNbm2{$i†eXƀdD2`p?Q%#ܢXXQf *\:Xq f`+x6ڬ.μZ50\ԹC}~`>F@̻3WjO*եF+}}9֥Md0: <_̝~w t7; .~;q#honvɹ!Ӵ7)+KKɮ_)/[`Ҕd%)[Pέ'Yv4iABCMMɛ[] BL3n\geJ٭~85a!/D $ u9hAgC-ЩQtv̩>;-v o〘AR]Sq+"1NPH>zJN۸scېk#EwcdF{LIIIGMU7v Z5דa5ݐ)N *xs|+˨`ݞDEfo5}G$djk {^׹ йI8xVdG) pD4D"VEl~|UN  ,6nrni 
hDFΧҰEbJ9ТޤPxxH3b0K2O\&# N(BD"j UKΓưQ_rHRh.B怶\S":-ѽְl-3O7Cx}Y`ЅGz6M-}FZrAnTS:+C}-ӧ91Ώ 4d! fML(i}ւ3*^lŨ⠋wk/h4,ƺ>GWVWwbt_YwEUm;pri"C{U1نvpwŃ;֐p_/YG]d>r΍j=Fۚ+1OUۍM=NVAxOV19I;!!gyT_O"wV?mqk݆R("Z<)@Ƙ"Kzt%* Ϟ?y^e*g ɵIf̏gc~\ɗDFh2Jf4u74: P!#M8œa8Q; @9jJXT#b!S! =In(@CG) ն&v0v2K!'sI3)X,En8Q F7K`# p-08[.rNgLiT,+~"JrsJ1Ic ' +51λ >L(b%A,X1b Z$AD٘grB5Ԍ(ݕ6[u h"{7fOqjtn{'esws~͆R4?˄,BWo 1~sg?^˱.qe[E;)><6tfOsӻG?Iu,&nhd(t(,)C񷨂ި"ߚ "آBlzi}1G5:\Hfwҏ'ߊzl˳ZQj8y8>'esc8\ԞZ9nW{ )'ɽdσ7mDx81?N_x,gy+S=8|񺮜אSH>֨ϙPrBVM%d|~:͏/[$XXڈ#4aIxo&lZoפa;]lդ.bWIcԝeQu "C}=KkJSx&^v<㤔" ru0:.=]pɣ%%g*b+?F"6gAM :Hʲmkh<plwA+q[>G9{7y !1ļaϞv#QH'8 #acpAHtybWB8J cÔ ,TXapdPӌPa)H4fYYa fa9n9FnX^X!qva B 1>ANb }WqLgӜ!dޒ| Mb˜ ;\'Oxz7]ޏsfvxvolpWuC6ܖ)g~x}\\?'2N\L!wrZԍ.CS7U?Fi$. >ھʆv{ڕq7b}μ0`/#]|WpGK1ϕfd׃D md)%D.Gjswh5,bF)0*fw#ҝ~>Q+ O_75,/BݶF}O^!R?KVp5˂{^S)RU0{ա?#ZVieZRy*}ҟ^/mXߺЍ)itl<B]*sT\J6YV sBsPu{q8_SJI_zμPB%iIy#(P aa nV]\@hâ !?_ֈ'{s߯Zd%,c4 M>/T{U{U{U{]Tzr7b OE ;Dslce &q,D,&63 =,In |ZV=`^IY+ݼ-%kwݺ}pB $@+X5ĂҶŭ!-f#4y!*HTM&SuĤ`yDsGs&:C[{hz #tT5Ӝ$FB8 " @jU h Kr.QcX߄ UzފC:ȯz'4{[&v<txtd=?8HB*lG @Up_΅$@dWBgT6ڍ1 JFJQnHi2J ̢GrvL~̃X.NF 8A5ZoWb3NBA{qX|&#i??Ś{teXɁGtNĭ}"k<&A Ė(XČ$$(IscA ID3s< &Л#>ә O߆S tf+wÌrۏI$z`^tX:NŅYtF<ӬuQ:,}rn^g=WH.z2Bd,c7Ăl '7Fn;7K?~x{W?~suz5,|O|xw>]{{Yw{{:|p8ç cdaQs yaL4F{?q)ǻ aRtp3NA0]O$$By4}wڽ%_]xgN/޳w?nH_6"llvmkY+vFl_ G*VXŪr g8h4:̖{lfrp1c0e֯/O瑙)ؾqM PE`>Mޭ[6E6  #C<1 s]zqvߛ~w2_BZƐ]VZZ݃H\dˍÜ"({Oyk]"JOjym:TmQ_OcJ(eY)G`~?I>bpN[;X՝l0sYM ں _} aTW6$u㣢]>u.')!ߢRsNhض(/K(`,E'+kn6;:BZ樢D lyp3=n;"kߪC~04\yVe} [x2Kbt>#t>B EE}î`G0-q0"`uzYZ)y_aOd*LpiR'ָ`1v=pL;='Z.g{&k7w?y&,1S귚D9DM̅7v ba331f2S%f*aEl$HȈ(<*y$. ^N,n s@\jY9|-gCh%.];ċɽ./yd1ժ kwr Vri'>bK Kn>"-O =!<׊=܍XuQSв G§o嵺(_}#_$"G`CA*, 8W1TM8 F&A5Lrr(Fϸ=c@}QM.#l9L `g\n5=`]Jw~ YxWhPJicƬIb*,`akP0Ɉ%abVȕd y bT_/Hדb? 
J *IOm {^XSc"e,рasm"+u&Ćk*SD&PKm,h Tvf_1ue~wvKu #j^8R(\(?*ڭCxZXMkj:$䙋hLrOh7v GtBQEuyFo3 vCB>))Sdcױ{/du ᨦ&Q?2V ^'{OwX+t=|]Ww.:*j-z=h\N>_ =+֩hal(Ȇ@tł%=I'q!Uc$]0 p߸a> 3SpB]s|'Y vlfii$3ax5mlM~* B%ۧS=K*W.gb /p쉿}Y6D~0\nhGfo 2rB(-o="]_W\'.DK/Df;[X̧VyMݥ?R#.2b\&*NcRVPJ[ճpI{WjE_f>#a E\6?JO+dI?ʔ%ņ F~pyE.(g EzR/zR\?'ld~࿨Uc`?{jLB@PPk޻'J ڔx6ox95 n%+zհHU^UѮ0u^]}%fﮀ"]lZRWڄ2MޫoVa2HP)[ gSO*‘_S\|p%´|6Zctp21W$ hraŖ)nRg3m#d4q,UJ8UBKP|QhC(Ee VKQD-B$(&",A򔖘Ibh+}Qh1a,C$5EibQ`fD`\4ALprm`ER%=ђcA4BofjOsFyVd CxY RpISM\`w8+.aHz$iUJ(QLHImhu+ffʼn0f$6J+./*h$ҺEV(,D(dD#KA]k %TH EɈŔE$8q]'cU.י}X-`•~x>Ѡg KϜnW xuۋ}Mz y=|3G/ Aߐ771Hӊ2{+0}ݿ:3Bt댪!3pNZb=LhסCt0%x\fq]8K$R1y j`ׇ &VJמ؄p0pVT*Ak\,pUC P '\x#s:v{겜q=lq$Wu[n`XQfeՍs~N0EoT.ah0o2cyn8K΋E(:mQ`9S<~^-tKrԏ/\^m=L 7K5I]A !Jՠ*d_\okVJ]Tu'_TJˁPˁjPBEmTKՙ}ǂTkĸbQxΧs~ ^n?WOTji{ ԍAݝ>"ps ūV=Ė#̛s] #Θ6ZGhQj|.pG'Flz'{mz' >;y:'og|yr%n LeϸG:v[c=(|)8 S uP!Nh p|t ,')vnQ@ڛTⶬӖja {Gj]:rXo#n.!\DCd CP/}b'>U[iaԴv {j:$䙋hLJ5DX ; }GaxbOVE 6,bxk범X @lAFgOgiFa<.5 Vʑ5 Ag^ܪ9DBk7-6`7{yˆ%AQ1ŊK[],Mv"Y"`D1+'M&)v h듿_kpYs>wy7&u̓w߃?lݦqB+V< -9*ZdN]*a}b~3 N:Թs)a3iӱ3'tHwV/U嬄 ;&d*fZu0#kq0&] ^Ma#"FDͷ B` xR~r^^~f߾̿+)#w5Q7)Jɦbf5Le;=R>ED}^КE)Թ7㥵IE)U7Qޗ\\e9CUȚQ5]YG+D3@1䁐3}dKEQ&,fU(QtU_){-/kUWPCx`{̞ Dkڲ1>U)hlΣ`Ba=ZPm&0ګv@@F;c'.?\씃FЎk76ҽ3قP0^A %?IZR/.H LyZm١|w¹M; j "ߖ}"U\ׅ2^;>bV׬]97wd˧\OǙ.Ǽ7yyeQvJ=*|{BooBǙQ0ZNdy(Bꤌ.$w vt(!J0Vk&gpC-)AT0>AF3v"Tx;V(EYԲLTv2Ǖ8pRlaˁ'Lo:Eԯ&w\ŷpA*n;BkӄZ+)/ \cAzUt0"\\SF ֎'v c kyw*N).{Qό$I9W^2=4-k@{8S*{iOL31 hP_떱RU(srN2G CL|qBpEBRP)ЀB?mҴ*Z9=bЂ B(A;ĜeDxe٤̣32u!xO}UrldO)0eS Ii',DITOhK(^aT!r,rDzBEBc DN2IJ1EƮ:"ͻjn>EHYSR]gǻF2hi%\@?.( 寇ao؛a'lգG?O.#&DŽ!kϯOFi2xbrJNj#B^-~׿sagO&\\G[ .Pj=< ZdsNFܟ;=($ =G0&L P/mk@> e` j,5Ϟ]FOToΛ 6u4Gx~1C6LQQƑ Nc͓۪2.}v-7QƆ0^&ڄFtۥExE& Ҍdi4^{; _\w7nh>wJ`[j /~vzvj}f%n1^藌YY`C7ﮮ_^ܾ\-n'-$^25ڰ)~t\\OJ5Lj>$|$o嚡3r }+"`X,ܗ܏T2pwB2Wxljo?`Řci18}Њtw7wֈw6Igq:w+ͽr{gRo396\3h;Z^|ao"iV_^3ᥥ,0 4:R\jggI Q(HtILUSd9%EhI^\&bL,AqH>DP :NZh64 BL%MpmўB$)J㔀$D7hfVƪfgb8 aD6YI\A*+( 4BԒfyU,zYh(T @ T)O1Rjs! 
*B];#TCrUr&9IV P@$G a5#j}Q(I2xNMΖ;zKʫ ljxs r灲w˭ׅ?(h+T#W{35煅/KYRery#!_ɔTOà`J1w4n{\g][r=Ӻ!!_.SZn<(MAM>?w'_ RLaL]ɪFӊJM<.aς %c S';@AH# 1rn\D ڬn-6{YX3Y~jDIk&kFR-Vɤcp^!6v2X8p4]Ui'EjBzi%2QU*;&pMDFJ9)=LS{)'@KB̵̡@ɫJZ;= JQ B!Qb DzL8Mmk@:514 w.Ğy兤Ve .6F%)WȤsA%L&ZU+ɧh[V@{}mތ7W0> Y^OO^~ӗ/+Ub'Oʬt6m!:$l°}wSW{O;?V OXȫڸ;5xぅ| j$X~hhxr $wyw}y9w^OF[G9*]C29\Mv'Яrw9צw^`>Ś[:v޷܍O"W^`6_ӠSf 7!xF!^#ûÉ< oyͻ6-%@+tey7[5h!K؏4ެ.(z:3y 6&lᏋ|B{; Ց\*9lNVq64](k Y ] # f=jNUc`caa독;lY莅YBMD{('{(ѽz2CAj!{9?d쁣>utw?ѽd샣ꏣG]=YOF5KNre sQ 󆗄F0`ڛZXz^Mg$Lk]N!ڝ^?:G=;B[^:G5)8*^bأٹCH4ժ4N_4e^K+kӔFS]Sv;3דJ Օƽ RAps١vm褡\yR#=& QR2"Q&Ojtmm Ի<6FM/$ik>9J у>x>\1emI:LS)JD#σJ1V8KSxD!U*Bv|2Q}էN:$o}پ+NU%+#v 6CJ(;ֳ]u<RW3Qp*9XO#/}i=wr}us" &4x *Kp-1)]ߙCx \,CXq(' ͢^BKg8QUO~QK5I-idG 4 ÿfa`Fyʒ#,m_e ! e`FVyVE)Z]r}!¾Hj8̴\<95.ŧtr;XL]~{p/Ry JqtBBhYבh8W^q`x zt~)J\ if7}wM%)v!J?EZ @=*Ҡ2ڳRG1%ke*sQy)R;.jd5?*'54bΫ0-iQId ِIJy:Q` lTT< XD#7<`YM2r?;tν+ZXFY6֢\_F v<8ex-& jx:[)x\Jx>o=` p6ojL 7˒'kW>75{5ݗ=*i`#^`"_)0Vn>3}ի3=V(Ctx(I4:u8.v쿾Jx(WH13Ubv@jMHvH0-\Dq0p;/1LFĂ.cR^x.J0FZ+k[ 0B Jl*9.YiJ!&տC(ګ.fbGmX+L0e8P2΃ zp;T Lqc]馹]MhEuϻ(Dm8/GRKKlΞ+qGD.5TVB# P2fGMp HrU5y|i5fgeJ=smd79 "j<@XU) TnTȪ#5׬W!(긑Gqbi\`Zjqf6x/1I qfB! 7*pyV[p~Ź8IT "=a8 Bᆐ R@ El&n+6]zxkvswG/ҟER37;c<`fi%^@| !+ [xvWܹ: L욷=wL$09,Bk!mᑠ4` @kE!bgO?Rd 48Aq0Ȥ(SB:nU&q;ӵgz+} !;X)%g(q1ۖ ĩ# ]eޏ[ݽ~̐QT3km/(3Z^)4k[T Y h]-`JXX ԁuZaa|,q& ,UH9{;?hц\\8KPQ٣}~}uLnFwfHVґgɃqE;׶?rwsڹC\qlg3â Hꑗ +wV)FU`}xȂ`*ĮXϦhi`>eg񣬇UG^=UG.I{3 8wE)P)\G%Kьp׊fPڳSaT+1T3a=V$9*NԡS7z^NWtNѧ` ~2_IMVÐ&LvX=h+2z .FScj0*խs6;ٚO^mʸG+dtb3}:c>TOe s\E=1Gq4`DqOQ{XԘHֳw\MZ5jҽb+@CgSyы[(J,z-LD#q<=|z gpqƳx:=xLs ٜ03U RV6UlSg~͢%dmՏ!qE8["(Z'VA ')LyDLE(0uaʚЩ0Qb-9eٖ``u8/ 0gޜswRJ`XBXjA1)iVL^_MݼWr_g<6:ٍF4ˆ`Xdc[e_3_>c.yЫx]Jo"ji1Xq4ߍAoˉֽmuL OgOҎSUkhOќ9*g}unh7Av e>v;8doݲݚ!S9:HG]foum?{ڵuOnvk4[(5`wAi6gJ?8T ;z( ?rRΩlDkr ^ _m"7ov& !ufF/6X3 ^IJQ&OTK*r-J(>/4ϗm?xÇZ}Y$BQ|46F ޑlt( 22_ *iQSVDhTY\uZ%Uo[蠍5G3vy܅RfOVm~ }9R){']ԧI&x-Q7.C>Vg<ܣbvyN6RjL7%BAQ[Poe <>t|<6;\{us_kR/b_\Ĕ6$q57ClLe94ǿi94yiF"8/Ѐ5zsQᇷߜ%BPRƷ?]5WL#7,YFtbX5CG2H$,{1. 
O(.p1(xe;#PU):UG#&ѳ%)>^hl#ΘNTKˑy+:@g̐U?DF3B ޥjPIql#Lތixg̎VħNTH?֩]oB ^t[\)cB[bA]PGAkUx&[>~+⒴^IEEr@б"W,}9g{|( .. ;wG fĈ2BJ0𸛒1mF1n+wI%30Wmi.=k+ަ< eI(ܬ##Ғv9EluKj _4SݽH+:bo/ކ$w<אsړIo2}jmV[cԃ3m[HMuՠ~sDjg 4aXw-rOM8WwNv1PQF>]v+gզ L24NlgwLՃ EQ,$n5ؒ\u{HNҶ\9w[dB%(CNvWç7`bxwkq ڷ5ŠW}s9C'vꜪZ!;U׍:Ac>kۍ#G"ë #H~=؞< diGܺw1̒lTUYy-IFnUe #9oGrEc6OǍިD}/_=Wt;'48z_nKAC-9J(@Vf8dIKF؟F>SZ]A4^̙1<͈4s0sGhPMp[F `#U;U*-F^ |eAgѩhlM M4^b70BzOC5v۩|p: 84֪JÖwB+ŵFݨ XkYHNWd>V,/5[On쁙Qh}P%k8E5cl l `7n&{ ֎&6 D8HF~xhd7O)dFvVciP\{`.肫59]h:XY`]y2>:HQ @kp#>9TԢ49NMN/բP뺩|zSv=u=xupo/jn?}oRL^imhd߿MGt?ɗ^7WV]a?7G->==9M)ֿziuy[A߂_gglKL!!ї! ׷sClѢ3ԊE?V( 4>7PɊȺڬl7ky®X;}eFB1-+ XDZŠ#l%DGRѫHnD]1 6iViQ)m{ؘh z*97hч=I'k .ꄇP'l@4Hiȉ }ZJ$`nWS  [PYJWMij$ }:ѽT܉Cwn) O~E6]yd+.n,r{y|:o/x5VE\#f fK٨QFqs - ]e"h /)(V~/[LJ׳߽d t4e e>q[]<4ᶺxևwU{@4elc\lݽzY].sNY24u_07jSUjͧ'HQ@MVtU !J2\m|rz6L?P)$ѝtJ(D.½t$mXݩEQem֨d{8jᨼpԖ LY0rcA_Eᨷ޿Y po\f9f"9C6DM#E 6<ڛc`p;`36l=up6:fDtJzSb ls^i4Fn}bt{[bA3@rMU( l)_iIsR*rvBJ¬ B%U\+LԾm$ ;~s*nQ v y-TtNih } itI-2k_НdR[ZNۆVB;g,$s86wαSzv1/2mmeQb ,n'!7PHrH*q:9Zbe h n{=f;ޱ|"Mai$-}#j Ryo$y,i6nRY,-pQJuX<>唺wGZԼE-Uu|]F4G&Xx1/{Bla~-&7l:R#kG4hIb:.{JSK)lg9hV Vn .zpb+ߑ^ah6sLS1MNt]`5I_JW!Ŏŧ|4FH)ZZmt|ike4C+u?-mHuX7gcuz=l7.Iv;pG,LqeFPDGj +HՑT1/6 ^PA"k Y2x?mSk./bЌږÔϋn| v/h<#6eet*OQMZ@.X )E!I R{ EZq QLzG#ď,ٮ͡ڞo`MEq˱YתgS~ު !S>H?@j Hꐢ.QUeZ>y!tӝհ!}F;HGȰ̽c:q,F,ݩbUq`6ȝTduVkkT36mݐ⁽҄z`kꫭ9SP"WH  ՠ-tuf`!S0o[$,ㅕwf@Ey[Q˜% nHѹVԂld)a2ySY=iQ,5 a2%ffгHXZ:Oҹ i#ښvWbzZEx.rΣK"ZHm%G -Z32 |Yt^x1?@ 8 rOh,8R՗!#<C8뎲3|n/;^+&tEP Pnlڨl1F Q D|{1Ҭʨj+bfV}Sh8s[Z呵Q_wlv ID}L,xK|4i$9Wa84< JG4 %Y (vk";6/Ѳ`L' mD,^bDGڜtFo0*BZ@MV5/O YxxǢ!7r^eHl^Ha,\7<2Z=]ɨ;}vɨO&&g}8dԍC*aTܒQQ6hɨ|yKbhC6Vɡ" в(ܫXƔ \,-ܓ/CW)] (i=Q걯+M'3ԖgRD8q)owkpOJ{tۡGmĦo^]==oo_=˪7^3)LY"N* mvb:M䵒PX3M} 7VGЂ}кЏDgW7GuG'gtyvP̳o4;~:,iuFMU X뚺ah4&4;CzRWU%Ұ#UJML<_['f‚`79>Kg_&cU"w𒚆 F{z-1r_ZHqh6[⓽alC%]>,95ઐ&4JhVWqդI/-Vk'19ey0abqo~,Jv A6Svc'E&RIZy D DY3£LXXJ u$g~ j#G20`N' i4-|huΞ/UR%k[uU=oi;uv%MkIuUl;WJo[>v *j]@q P%UWNUQfU*1b\rGow_>\\u½ѣ/zw]_ij!%gGWcN>/ہZ! 
&`S૨ucciHĉ`c|`Pi/ հq7PZ8Ȥ Lq{gmk  jۤ?՚tʘgښ6_al* e3*=YJ*9Ǯ}ɖ ͬD)$I?= %$(@"].ɖ=}&ha ]_ٱ[{V+r 7]7}" S RQ0)a)//OUw|,g䇫B<>GHok->lOO^ ~ffp2-!&!gUFU?sѦa@[ʕ9FT/z[y{DQa%$Iy2w)ǝXzqlhfI#"S_B&  Af$_S u@Wᩝ5h\je+Q֙PP 2 3kX_A2n4CيDtG4'hB@q$0KƃxA`@ o!B+clW--efWt"yt]gF~$j]Htzs7=jqSrP+ QV<́޷ z^wx^'2z~SuSxF%",)$))! s,`? ܩ2fHFL%VWk~Soj*ݗ?}+I9DAT_k!Eam~qu%a>gE_nHfp7^I7#b]$/'}HF]D㱚c:f-or5\tAgV2H4Hb߫5ocMA֌!Jڣwc(oq[\G Q;DisP\͞?֒`$G\ޯ3 7n~L$?{J;4Ʀ" R0HL^x0~πFU(F7Q Fs=D,q_h6&ۧh.}h#&,E3e gϔsAd1f)>2)>Gtp1 CәPMM':@1@O|6\3>¤f#x -˪A!. .,Ͳ43`L0'P%L=RyzuhVdih4/.k@/(',]X;:>J_c*_~G0]}/ceh~k4(#Tzcf]X!\j#\iLT6M'jd~NrɍGmz0R*'6 4:U1$i2|t0nHŮ S07sos 0CAHS4JēC,d@匵²8p)u6!>&i ).2C,fc`CWFgS Io pFlwwWWHKo3Wth<~X?ߍ|$Ӄo)7$>i ymAp,Po6^E+QR~a=5Ɯ{?UD>?9*x40xn.KAJЖQ*;PE翌fon?#M 0Edٺ9uOcS8r[ܞ*23dtj$7&0Vd֑ 4@-9F:gJJO=&FlͰˈU jx|Wa?9: @Dp NXbcմҔcaꉽ/cL~c_cbS:>$*e<9$(t*KSq$ԡ5aCVni:QTf8Q 7@Nm@BBx/DpH)f0u^Z-((y3ʈ6"E"5PCY6~q%n ,* "g)"9QkjhAS?D5YzsIC5oԃ1Cu6l \{ XGqcMSqq܈`{r-ģ'R7BǒA9LRpe")( OWt Y`4(@c٠!v ՄGmKb:⧋ƍ5(3k) VBTJ)(5* a ƍT{ 5Vi"ABBR2X*=Šڌ:aP7jM3RtA[ )'j C@76ʈB$#|j 3 ou`urm2-}&rcM8>r5)5㊴=ѴYʸDP% `pSTs@ZA\3 ,nKL=WqҘ9p.?Wv2JnN1w}'/Ga` Nyq AP&S!cLrKd`Ll+Nsq ! ` jZ`8YF5FmS#["-*\_XR5L䴻YEt%JhW77-⦠ 1؀s/~mγ>NOjmxԝ™:M,kքPg2P *Ef-z+[M{8[oqfbjZZmPB{j c/mb kOARnX|钊/?I4ƨ6xYd+p{{wY؇+(y*~6?Yj飅嵿WSGy{Vw:}_j3s3&F=Mӗw|-rKQ™GݿUSjyd$EL}Z5톱vkAi:G; yٚvk;ڐ&2%ٷ#čhv͕D}-gz Ga%2%Nbvx&}XYJL#IB/>àݳ`nt۞]Cȳ%Ic7Hţ,b[F-5_dFFDaNY6f 鱊)tw1:v!ꜷWvb?v>O+M7*8~"~|Tk rv,̈́OSrSfA*6d1#=u};v7w|.! i#7RiŌn?3댢 cvVv'9僣Tfp̯/.^,H X}[4;UYڹ8m?)&E1:EA(tkt7{A}. 
gtUcLiy)h&#  SN{V/?=`Q ;r±So075d"jRdd ;4W+J!k%C E.jaGiF55V)) jlϮXiS^cKeO+=ӵ `ׅ΂lo9 6 WoJ+Tܠ6&2'Zޜz*|qc<;sQr.=\9EH OcPA!Z֟o|G1F`lx/X ,8`R4,zaz>Xz4}dϕ¿#-r0S7p0a:c#YE͕X؄-(Yn:gv{j ;)upb3g05ds'0gGcdjF>iT1;7W{%![`D .q4fB(4hDӜoorr{[8.b!C.^z檮P$B $: ޭ,bG 쀄K"Ҭz87 aAF `"z6 rcYI58$5dHj%?u[1,ƒRzdZ)⌲ *iUdZ@Һ*a((ZxxC91TծUmImEm׺]efA/s1h5aTV7dM`a:NVM̽+_Z9ȑiPs K))2\݅U4߼0,(XQf9+Ja(kWi1oQxvI625ɗRL3QƔaGZ*r_+v DdOL$EZ1TrJD B:ݲ,qIB4TiBD{yřc# i< J2(+D%/ J[r ռToh)-BldlQ-5^X+Ԃbq,HxGuizzh)|Rh<Zkz^Z3ѩ7V8<\#-5)p%GLKa+F:XE]ƳۘΜ>GTSzŰHzE-Ls85&iQS%J*9L TS8R:djZ"s R͜Wt]߾hr8UoGh׾~&$P5VwX&n]:ogpO<\9Ā1gt U޲~ƢZf_.p727Z,IEݾ9.=̯[ 0\?WJXR1I2G9PgjqJ|n%}npUn1kA/S̈ӭ vzQOdRJIAj}uzHUC T2,ֽG-bHܪ+9W|\"or vk{rQ8$lYL LȚA0oǤCaS׸,^oGqyFCukXƨCt4{]!(`nMZ2:o:^%?-6Q>FB>T fb̊+Z=NFusQJ`= j`q?Ԍ{P1ƩC.{ӼΏm)V TaiO]oi{ƻ {?{-k5wZOj=S TÕ~Vqu.Wm/7bWWM00QGֽY,kK,ǽ_s$?w ʒN4o Q/c2ZfZaR3!hvz b+$n(gd 7ؔ6J_h)lvV\=. |ڿoR+0)]1_ Ŗ"(lzH%nCBiUno\xull",y'cY\%rU We|<%#Z) "eW-qH -S> c} S~4Ŧz~{70>%ozUӁލ>L18|{@PBb:lFmƃ({@S-B6>N j>\3(G诽]>=򏋣/[.cŶ?>ȇo?>[}>+^T仛;XdnryFcDžBޟR||,.S<.zW)YHO|"pm"o.& ԘZŮ6(x'# -gU0yˁy 缡"0XfTՆC>O$G/ڗ6hXH۳$8xH}]}Ja7m!sJN×(6F}LJ`O>L&@ ~w'xO)^8эC4a꠯\<Φ_LL4ݟ>`XbZ7V  B, yAi5/>4#oO2Grn?/2$ϲT[Պ<N2"BA:C X  seOf8fmQ1"2 _g`p:@DDڻB!ǩ)jkj4 d6N]MAQZ|ㆣ7iBXfl"  vJ=㦄wrD-f.zιRV>&Z09gyoZ4ᴷ6>zw»b!CNPUvxJB 4eu<`Y0 2ZZZ Wɺ;NyXvyܒ86,nPOuR @Z^ד^yz`P3S:^jxg(J˾&[br[dpspa LBYjJZt!kaǖb=}ҔlOq1#F (yKlU^"c;bn[,dYekC~kGރueHS+cy#}K„FK&06#^=^XfHp#V@jP-(SHwA\"-ukgݖ,5Y!uZp5l+fxpEZQD Db[9lopm ZRĀ9n_7K޿ǷusB kA"XYxW1`1zB׬/f}y%pjG{Qk]%՚zRu-{;=y/.(wuq,Gq8A H $w^KwwzFidHz1dŮ>U,:|!PtHuAZmy@jRFަ{Ӌϸb:{QeoɩӢE8Y4Ҫ&oFQNJ%'>K _Zcj[7.Q u`1Ee/+rC;vw}7xYMө}|^0;OMZzj7]^g?f;Ie?ś*w'F,䕛hMp/jZw+ tJ1ƻn`1Vޭr-)%&n~ @5&F^WO*0ѫkvGӏY)! 
IPws2f-24Ŝ[(w3sEq3T>B1#zt։5 O)ˋֽ\3vr7~M \!#EuuUj l.zRʒ%*#e = 82&BYIm >EǔY ѽkVؗZkHb)}ǘR%S咆35,䕛ttRRPhƶnv@JFAm'r/ݽ~)r +hkY:pf8L<Ӊ}z@ci!7s>Z Ku6H1{GtuuN,J;=o˝hF*>EoN0%.oq>꒴45uہt/lPGouϧ:骜L)ee哩R-kSHV4^v_ZlF 7BҊD`g5 7 g3x}U" wZ gɴgaL+B# 6d qҴ\ д*ɵH[$_eCpee`+9y1R/C\(# 'qES 'K.y<4&A('k%M-Kl0hh 0Klb|>t aepޘ9G94axFX&/39 EZ4[8>v n- GSzϏ ulZCjn"S0 kQQjf@o#x̞/*/lHuZbayK,>9#k-Ub$xѰ@Lbz"1MBj*QYm z^T69F'Ɩ\0HM/,[](%jr7Fk(v4h{kc$"= F֘8r6XYD$S/@7/oItr=)Sph"_';@lqj1F CVk`>.컧zdM%9va߉Q {FA^Ha@/_)waGR rucLGN!)beW̎25,䕛tRRm-fb#1 ˳%6FG(S~xqؚR>f{G5Wm{{ +5[e+G伹 ^"_o6Gk;]\Uo6jOWv|͏ g7J{!6rxf31{МW]w^b,Š ~i W%\K(i9qEYy}L6z7(lB{;-;рZ-%U](5Smnu% Z"]ԟwdG?^KcjӺː{ӬR:Z :3A0вCHQ[ض ^xU?FFj.>Q'+G-~%U#n+!}(f- Az}d4×!(XCmǡ^ +):AE挠*bEPUT,ʢ]ߝ`A+"#`+ˆVl+ryjzŠ)a6SQIRx ^$ZgT<`\csvNOgߞ6ZՁ36]4 YO»`Gӭ ]ʝun|x߃`/xF9=YwCx==V3W\tso?35k.UFlG!~"pոb'^UoUЃb &9.fcdgÖ}.6no5Ckcs)2ԝY;|#dH[!}{ћߚ WHJ;Luh@@PW?u:3V[fEK=p7g&X;+y2 ޱfȍֳAۧY"޸+֠ t2 Ez'_^}EM^gքEkPQw^vZYC \^vӋ+UӨHV57Mó HAVC̺IVolfeVp\98l+A)4jM09D~63 Hv=K^/S`@1JI3^#*RAH ֞B ? kr^4~޷`Yq?u/ [ϷUT9gXcGvry;#gnWTi\GDFٗHiqm~qjnfTyIWMU}lqj(P9(uit7KXkþVx.w;;]itgtg5,䕛hMaZ3ލޭ,)|(v,7Tڹ݊{nz MĦ͑5xR rLc >S#n2[ y&ZdSWn(]w)YSQt9GR:w+n*X+764NI7#z bm7JF :X5o7g4u9i N΂H{NH(_d?+$S? M֋2g@4 po삫Ѽuzѧiv'#fjWB.npn(e 陨B_oԮf.W3[ftfso1Te2巈N |Ǒ9- ڴoNJ xu6oQ#EvA6(TP۾}fGѱ 3E $DmVRܧXsF8 =T}WCL8nW9%C2xQZhE\P`ㄷ4Jw+--W #ĵR:WEČRGhs񅂒;:SP%Ymmɥ#xh5xgU*F6p\#- LCY|c F"HmmD i"߂I*j#HI Z$HM!8ck_s}0 =F|aiA9@w>c8lF 7BҊyhg{70zw[9z]`<- ԓ^b֋ڰX9+i<K)ZBdA]PN'JiMRɷ:jx6,D&@s@+GutFɅFy]2`"#aZ^@JXGm(G@X|W8ي)x3r lM&: ^u}!P 0L␜.S]L2y֗u/ɻ c5I&Oj04:c|#杶:&ꯗL>y^,+~M (Y-iu&axed# $531Ȳ RϽWW?|# ջGZB575M(QJm\JQZif#&-+xgܶ:$ipBGqF5Z-EQ <䤸jHKC,k_E9KzyaE{;6IBNCn1 ?8p-C_,=lXEF:Sg z?KA54J( c%pNrM׆F 5,䕛t"kq>7nr'o лĐs8lH/Cxǚ!wϱ؛NV}UB%9h6g[ɼ-dù;1*qv(;.c%5ץ |Kn1*t K bqܥ0_IE4h44+8RwJ2 Z3h8r(ct \/fd,b>gIf6twc~|W5qFjor qL,)Πzh;mI7uy'8,xiВ^O߾km#9З܄Rt .5KBfגv)Eph]QSU=է2697بJ F 0IPhb)ИyXJfܷ܂]2DyK("ǣ"ׇHteTDCuT}~v r:"C6ߧŢTDB)3o:S& `a*@ް@q֫8<@v.lICzB1e1H43?s4 pPDXЁ@JalV2Jd\ ! 
ǻtĒ_+k""]e8}\ F8<ۻP: Fɖ'Y糛q?]3䮃]&wf:c C)%̄#0%2 hƐa5Gx%]/+_Ht9Z6!E_˪'^I?كlrF;9% EUЈI$BbKNdž+LSdrD_8T!&B~ e n;hjBN(@eA(H qzp,pn08LA,|z_XiR(@SKN6jN-5^ @55>\ ;4{FAS0A9REATNJJ̩_:nxܮ_jfjw9ߒF*QCF $-h;e?XtZBK3޷9fUY10 p X7ڕ؅]52FDKD|w$TP ݩ&H?l]O^ȎI1-GlX1#(?^U8E^xqvr15@O7=mUp ]u&HVhvo ոֺiI=== 9^= n)m"?du5# ͔ Ut3x]A\shn݈u؍HӆHѣ7} ;H&%1Io_d^ -0.g-*$ RP(çRrSZ`Wp*pXО+"嬹ef<ь9bCsڮĒH,.[ѯdayRVĬFʸfWSTRuVOiv_r_^\d\<0ȁ~>)&\wb=Ad'죏CJT[6*04Ѵ[no>4qU;}Kg/l\{Gs0Z@6pw3? MF<EF's:gMLt\/6?ƙBB*ndAF6*u:pmΨeI~d{g5 mnkf^bKK(|kAxq0Å8& nG֎5kmPId)|0,šGao!,B(_&}g>Fg ։: ud&F֬@J~uͪ : ?eAб O|4w &n% ‡=1PzLEc -jmA(BJVp[R1$QtCWp$*mabA +_Az9*ОTⓊ趓/3~dg#_"t^ot^BKׯ,W^C-d5xj4$cgVhn0ŕ6 ̅*Gk=v9RsO-,)*%Z݇}4 辽VuO4}au鵽I@oϐcО9lSD5zxD]4Qy-}vCWW,[1ZHOx[л:??qBsZL[NeVWtGu@RUQO\MORA֐GaԘ޺]5m5;7j1v3*!^y}ظ{!B~;zF0t +/\RtH!qpFv5n^ڽ7eX 2Mnrף]ݽ-imx>O{GmN'ZpVq2է<*ڪML0 V8N^f$r}z.@lUך6=+bz;ZHBNΦfl=W*gloW`UMl ͵}[c,$tnnݬ:{Su|Q,ٵ e,c|~^Eg9;|1Jza=3G7T8ԐrͲ) &1}nNujN λlHnMX+7,BMD%.GnNujrN>m{Hޭ y&eSH__{7`t-w;KS;n[M4ǦvKM$h=FgyvC>?M̄Dc2!C7~M8b^/5N&y'41 `~<@'@I= CL^C0 M><5!Oܙ(8<(ݖ;}S`w,0z 4@!{4%đ5uT(Iʣ+)9^F^X mQ>p_6qgk`0jxvh$쏭51" 1X4fSK4pō33MȂf~нigxogu`x;a}"͑CَyL*6y}69mz;zQؒ8:Hc]#n:)wE&J_,(T c`JX ξ:1`}}Kv܂S%Nя”?2 @ҡbL3hE:,錵Io]|1lLout}{T4-r_&?4 v\">7E%mwෆ.Âm-n !ZZ~َD|Mm<`#Z 9^{++deV7顺pX@Q :D/!&1 j3:ȂѦ}+eS6#lJ hB^Y'іID)Aq<{u6uf6;[:ېVws̢l-G\YD7}mWs : ,xysT80>|*?ȾpZ,{POWgf/C_r1˱x(S ?->_)br7} ^>UA=c-khPk JD Z;ȝ A ]asXS-#LP,-2LjdSl1Qo~[wÿD+EuWyq50ndճ)8pʼn$tXlI~6]`#jqjib!pNy*؜^R)@?,vqVJB9h !cBx=䤑@Pƕ^gCPA&וX-1<ZJFV9QDIQbK9T9Jed: z-<Pʻ9 XC,*X줔7vJ~oB+īIkY2+R7Ω3  %D9f"^⟢"H4D`q)y;~qu:GLfd]l!%-b{Q"t:4YܝxL1 -crL3M$/ނn̶2W\RVedKʊQVx~t$m U_Wwj1vif5J|*N:ԡNuMҖe!PƥG݆ +PVCG HH҅O'6R)6K# aţ @q1>l ̶0V#θFNi=bu0qA=L&xΕQ(J).j ?3[RKQ(x!l!9PBmc.bXIcoV!+rtn-j>zDU!f @ 61<nEEP"et+!kEÅ:176#D6&ѻ Dԅz)ɂYtԍ]ʿ a.G*#S~ rl#?JZȏ9ű7IҼ6:6<_WUU+pD`ǻ$rg ,\e玱?G)=eSb&,䕛hMKTQ|4Ycw4^R?#>%mD?܄rͱ)XQLLM)$>GIgk!6WmȮQ cCGd&_{0S ^]`ƊZ7C#1VEg)7<+`*9{1 { p.'a3g&D+*{6[/V|{wZx˲A,-W =+4ZYGyCD@37o0@KZo)\S$|u˜ȣb}SE5~E@ ෸nQNaHԞ▙ܾ'"d0J 7e"EKROʐt[6z-ev S T(gE cHAPUD6~H"V$%bN!K,Ê*S7O_L#5F^gc! 
c0XZ+ 9֘##)AQɵ5P)iU2J '!Zᔢ JGP+C+(R?m1EN l tJda% 8Ýc,R[( tDY1 wキJ@oDc/-kYZ01ϓG4՝Se3ًӒ^r+qeE6WޗuAǸN2߉u6I_X˓z1a 羛DғAuM> tCVppC%HdEH:=ޙ/9/lp.ʺY}\l Q5_~x 2wۧߟE[؄_۬_8~ۛQ1ciW>Dp\pYsf6߹_"ݝ>X%[0, AOwog?CMqDgj{7_eo+_ drX\_ Ҵu'ˎ$OvjRZ"i5$ELI(ѧFG .ZfDN-EXc̔g#a3'akF G\$&DCxYX2cerFMEmnK71vy/Й= .ݭ.Ec c`5_@ʹN<^&$cNLdM]:eDWMNv^(u/'/8x T*>PYEQ:?6Ҍ|<ٙ=V=1m# =ko8|4_LjzOv~}(9!K9? i~%B09yNY@yFR/"PP <BSt`³Ϻ@#y9:^͠^;*N5=Q tN)S2z_+c| 5-c`LAXg<1Ncޘ $aS)xlxlϮ}C?LYB,JXi=Q$MDjx,Q&`ȔbO;O9771X}InƇYxH 5H+}5@KpEp}r[NrfJ9^VFLj-%r4z]3ƨ!?u9D 2)^0;35EOQ3JeBlMQa]1OXe|˚~嗏Sep^<Z\`5z=)m/tuC}7c.Me/[;1,"Lv˸԰=k3N< s p%u v W?=z&㥵C۵mkZ>~p=v!) б]ϫғsDS9!;eaÃsIPN=ˏ2 CpI-sz,&'y`rZp\~"z5 YK}%MqvoHuz68vo2˴zmZO@P99qgkdK|Vz< APpga_iQi5]{QKclX׌1fh/jK҇o1p6)O{>)~Y[aSOy~YEK-/5w[ t-G㟅y3oǼ>$B1j/˚~m O4dP\ڎ{&Q!:8L- pH:?]K.+shG"NNu~,sDIQBYj ,)֌ p "IΈdKٞyDufX*MciS4Oˆ/(B|B;]B4wo~nNG#U;aៗ,v6R"`uj& JhLJeP8eBP*b&yDs=@(c|4_̎}{m5 A$2K8FidX'`6J Rb4/qJmBAAZ6A?*{z㠺!%Kbi] ױwJݿ sb5k)&+^jWT:YjB&#Sqj`A %s $&B8}FX&ɰ SjYzЯc4=1ՠ6mQ3%&gۺ\dyi+[1)jY[w?luxh)S4w}N)r[`;G^ V 7 vUoaeQ8# y""Sܱ}Ǻur Qb#:c4ng4@N[D}[E4DsHWoO5啺:<C .ߞj!y])%uU43HzլFzmPI=g7h`,~rPmx_,aX3P\l)!\R* ~;P^ovqL.Eٵp'r@4'-PYE.γXΥIcv.&=u[-/,JPKV*G?_׆&<2F<|q a嬂{fL(n&}Z sϭP5c3_- &aFܔlam4 M*]O6)H/XNJA>(}.@%ECjȚ/%(F{&^.]YG! mA['Q-=3ĘbJ0^yU'g_V 7"6"VWzg#DyǨ*Q} kuU-\_o+LPr's͝`}@BA~s'j&H-ϘO*՜pt:Ϫ5ѧ.Ҿ}pb;7O}6wﮮD<}m ^oO_VKyC%T⁒%(bR>Z§`Sh#$"d&3-%.2$C,aZHaFBQDKͤBBG(S:CQF,FE s%\(D` @)d -퉖I 4)XKCS(BE D٫*FL2T8KDE~hQd#qI:fU0Q3 jm5fvYP+u]Kvڽ^| -!3r&hBTFf),%qqb" bGWDKbbKhOȌhEݸ/ a0!u-|O9-{ԇ 5XGF~w@cԐKcKՖ /_ LidkDx>.:qu1U6a9T_=WT"puCAqJ;r&18'ma`~,QJg4C(^-V4UD3]jBٌQV"--r,dS 9UZޙhZ@՘^aCK{ac.|ZT|4#Mݨ؅yKG[Pfvu]ŽTL29eq*NTD vTu6rf1O,ۅ!$|X{vbMFɲ8PS~UL|n(j 5F ܱY%H2BPlHS>e4;NS0eFNEe`m #IMౡ&iJHd,wEJ$J8HY¬!aK# 58)YX'T+bJH$xBF&SZH'y$kU~,_/8xM@$v^8,;W.A2%%v GtBhђJ"=v bn]H+#,J !!+44ԤpfwFp .ոA!yO,<ļkX̬#dxr$:/3RhO3pNUG?C ɴ9Wb( &+pkP&5(v)4$e k:PͷHևcv*%Oet[2vaptA>;at7;08@,pZ|7,L'{z=]ޛ,WB.YelW4yfRfq IԤ8oB'Au?UpfϘ|#Q`Tw8pa5tϤO߫ de p ? 
5V0㓭Ugr?֑ ȑ_2_K$dJz!Z7T֚X+ժTw'Rhq/R/7j0y&kc2'g_a_fS8te%imvI,tTL#HQܑ\8Z>/lؙ`;W[{8P]mo9+⛁|9p 6{c;-g2Y?R%Ylj; /fbT}%v1u3* muP;om$)⥋c¶% Ǯ)]=Úc=&fґ'YV+tl|65`o=A~QS_~79A9C-$Sš>w3pQB_]F ) !{s ~WfiiK&uEпry`Tż&"3ʿNr ?)䓻 j7MtoϢϢϢ \^|t7@ m nm1ZBi44; Y$N9{Kϳ4ra_NтaAFŽ~e7n>b=6Pdjպ{vfΝZGO0b\򂖛N{| i8Xz}I0 mr8T)V&2|Kp 4?zjHKq`5$=ꟋQAH5KT:0ÆS]qfXm |l3lH BRyDA{`2GBbL07L4*Efc2*6gX8qṞI0m ܟYp VAhA^;^prRm)F*ꅍV(4NI'H׆r*OB843VB/;e/:ڻyL X>یqt1x?TgfNf۟4wӿ/@|plq3 O]A @x~CK`ZҠi9F'ydS#.CAyDjR2?Ŏr>+4OK{-p%;qNI9hř.LqzF~tiSö/WL#Da ¤*VBx  p͵S >奛1< r{zB}סFZc8[?f(go0"q,9a"މ;o֣P {Xϔy⌽er<҉v,^JM;/UG8h]t˲V,8p;k6 2R#z/ePj_Ϫ<:"jͽ@-5^)-P hJfX6Űm|gsքn&f1վ|H! F1FБ֔*zLD7WcD3< weAzC1/k/8{mPrQ*`TQ~CJBxC#|A峓4|vYSIR>zhYzgnlӃ+pئ4*5CM]ئé8۴p#l\#]C\#h/UsXjitica^8+x˙vN7$VNJHYбŸ~ pF3Z6$q!˲Dq l@0Dcu#P.b2,PE AB+Mp\eP\ t/HR!RcR󍣍 \yppH0j^s@ʵդ6o8w 5+ShmBC2f<| lwMאY+&rX +dh]@gtGpJ"~Ifc bm x@>Dh}WމhM):tEjleߣw trR :8whw+a!߸fGƴzd˚GtY#.)} mIP(i{.HJ'|JE=LY6@琭 /U;+r3; )eH>eDRگ2r֖Evfv=dq))SpGJ8{=wmt+Yf/.3$DAoIV7W$@& 3cr#r {PR`:T Qp';u58孛 `ψ_Jځ6 m[ \9Po'v̙Zcx-$_2{=yx! d1t-y-sS<%bO@Ț-;\+@ TJȕ\MNx${6)$0a `I39/-Q9!FHUFySG((c"i$qCOiV5Ơl,WL8Ҹw1Jt+Kl_0\W$"G:9-r'D*W &NE (B>Ĩtۭ.I9*([烻>,[g8#M2m*xOSRwa#ژ0yYm>A=U CDM.Va~r /ѽY 6X!֋ MˆУ0|# PͻXm׼!ACN+tiX`S ÿ o-<#&zObbg~NXbn 2QNTIso.tE(_i$Q4Gq*y {gaRaoe_N[>?ϧӛwL(/()!f|"#>s"_^u\x< pd~73&aPw5_egL/tqlX8*,O0!d M*ą= +BŇ0uTMN&a@Z<2~)t5ϋt] {vdЦD9{#;.'4DvMɮˏa]LP%06dȊ ?Ur& BKs/iuqܚw~w n})^Z*ݤD']z %+ʘ|;;Bߙ}x\S#wͮm?dfbo9BkWg'n?~ [-Ÿ)0j juV߃N{oHMB'L'L:f?Vx 혔ߎ!D9}惎vGvNp<QdU=ɕ,3ްذI s%H5[^n,ͳ6Ϣ"оKe]JYyˮTkJ WvwΟ^_NsIQ,l'*+ :XcIob)v1W.7:~/wO!\,24+PNY)BT "ys Sj {&:T,l9=x^2ýW$/SRް:$NhbN.;WvvZ:IƑ,{EY'G5wPqTŌX|vk? 
\!?zݟ |d|[9u5}v5a;)Vn<00[c'c<ppA672WDV TըR}g(\sIP@Y5} YAô12ky\0Q(&#b"0C%Ҟb+=`2YĎ{YCX렴AZp5U"D+-1h "(57Շh #{zj}i(kR\%fN,/<=IF2o?}Ҏ/!ed!7#Ňts] Tq\λ˖R׈I.M.߹%BudhlԮn:V+s 8rA^d#]OP]*N I`!t!IXk*֔p;{YuU/XhLϭsr\R,*GNx>anf4^t5O?|(79"Uc njhRl A3-ҝ1WA(4_X Z皠UK2gEᕸ z6Mg+()$)Br hVOs0fe,KCiRPtjȾݖl7 pK4)HqNG }h 6gb*qx>go?Yоx$H<7$Jw/—%QK$ԩz-\>c+}eՐB(Te.S6^is"B-=tɞ$ K]cn_ysd313 QDof;:T>bH9=zrZG>UI \$>*P%`*4VFnxƛ|21;CGg "%;ξfgΏY6/Di>E|GĞ|Y +!4Ɛ[LIsR D פۨ}md!Ky<l`Bw` Al$N[a(3{1Dۖ^zj]GE4p ?Q $%g˸f?F @0LLJj7M?˿j<6N쭽8ߝf&7lt S0|] qAos2DCt#/bXI`hj' (]Gbon$v:Fc5V 'gŻA*x`NO Ց,kC7:oZn+@CVtk3}d5m+l=Hee- OSκW38R UZ vj(;y>{}3<{mߋb=H7?ϣVǙQ8y%yFKnn_l=˾-{ոΘܛǣl)Jlnog^b{뇭lee?fbV/ؽi(T BVH{T W9~c kl#+/cYٶ^5>/`ܕ.鹣RG'Y=-ί$˼4}Eey*܀Ց>C(BʔgWUu +hZ+}{WF Smv~ GY뿿Y{ |[_"5pymZ_Ltq/u֫_ρ_W>fhW!W.5>1 ڪ~çP\o'~)s) ` 5V@lR,>AWMh*7X|zޝ6{ejCqUQ"ͮϯBb%V=M;EӠ99^2H??`lQp( 1>>t0 xOLn5,3O%Q`%v={Zv2o׉n+gj_u\H!&AJu)q2x,@P`nbu^7nܨ*'tnkq/1j9"(2 #,)G׉ٙ1띘%%S9rbV>f*Ƶ_C?.˧/r6ZrqW9=Kl  B$FL*f V!<؜d@,ŀq1Ie*Aˁ%\ḼUNs&ʼnA@47慉{8YO4BhGAc˔uX+Ǽ[e;ЈS$h=X 1ɩ\W[I%Ft5I  Τ`jxL38j^FDX=@-|6{h8kL@  gE!Ou1jSUTޙZ$WK%cr'q$J'6zlIpȂ bƼ^a@&DJ7q׋a9f ꥱp H I S*8ZXe=&^$]I&Q.` #$t&5! 8r iMihDD(40-Ts69((0MT Q!P$!nEj$0.Jð,d+Yc- 5.ATfo0*텷]h#@ 1p*$K.)3,!C/J(Zý\!m1bFm'`Ơt RIJ5V#ʍv)[q>@sǛ#VWD舖VDUHtD"zÑd5ӽD V&*GiP9RnS,X3bsQ-:&{L'0@*ǔ1ZGeP5OS&U17XqtLjF"OC'/%`X̙8בKJH! 
qSPu)/ٻ6%Wf10HbvB?-%Ő"} )Rqf(rwbawS"/ ѷ?X )=9U Ә[]4^=iz+bH f)^){>(݆B/)f $L a9a!͛Ź sC$)ᗯB//X{H$ L.y@<0 0UZh%kZ5̏g[\sĸKB/  s,-OEy +pB/Ը`C#WČ!Hds"0UZhպ7o?q6l ^+ \W~\ZE>d( R n[~B{Zt0+VhRY괼_RjBG =3)Rs- Bzmا.69ØG͕Hc LqY~p Szsq Qla C" ]hiiC蝐IV(%8Q8c*8BU [R[zjgƑR}6JZIcK50Z),l(CP TX;Fg"5zeYW [/]krœU\YYȺ&6A;ȥSA^M>rJ)!MC>Ԇy^J3rtFu^*x.段r{ \PzMolݣ \_] WmAp%OOYUUzQ_Ǻae7/wϬ@u6i _e3Jd"lQm.@Qvs971*b.N` iHȂQKZĬ1'H *R橈BP\,=v -A2~k#+#>Z+ q'h34<wjtz~6ZJ%xB]ko"8+_P&/rI\ n~a%Ū R:\i1$JKX;ERzp (t#r͝H )BsNz hmRXq '0Vw8SzlOatcoB& XJy+`cdQclaԸoϿj RRq rwQZofXd/hxyyr9r~F9ctΘdsC"Y`"WH ihbmԿ\%(}҅+i"y~A'?U%q G13E^7YeZDtbuKk8;r2:I>Q2)5,ǤKNr9!܆xõ|y{MU5Ⱦ}2&*w`b)uH/@6Z*vQp)JL3JM q2!$cR!$ ֕y',\^N.R/R+kՃ|I)2f1+0tsc{9Ţ0҈z$baF@iN_гJmXAg6Oc8p;c8: i0{zJ;pԌc8zc8p4հ}f8sptP Gc85Ñ/1u GQh|uc0-3"t[ˈx hu#cq\@{6|k:h\Ѧ˚ڠչH&0UV^P9#\5{AչK04gCygٳz1{tٳO϶l! 2tĻwub[ו_jئbaX2K}ey޿NA>}qi{["O+lۦl8eWmsYYY LbSq0|HcuG^+ֹb+ֹb+ s:W8|5'X3esf e.'-tg`+7 &jHzlb'&-8HגZNi.%E4ָ[X6@hBt -4J!1Cmھ̇kIV=sʾUxoG}JV"sLBf .$'JWWJIJk.Q.UMv€F*WSa@Y~w( #y'וj𼾾gcX_hA Wwme5lE.J&&0^O"WptnQ!RLjNJdIIV|(%Tkڵ\fւ;8Vf/uvux)WLd_{KdN ~,0y0)+.]5 LE0UJ ϳ=q׮}r֪[i~s_m)ʁ ̎ b±JCҌpʞī.7L}lk OrTKWzz}Ftܰ]>V}8zMz1A\ca\c&]z/}t>zzϠK>D<ڃzNs) m T{V @Rv9 5/8DC25Se`y#NiX"&ذ:抈殉&__rS귿 iUY}.DEykhI9S%69x.7QȈ5 Hy0Hh%c8 H$s8%QA \FGAN,5Z1F,8Μ6Hurd;vԢ"!%n<9mRSa)w15-3qf<;31Sds]BZj*s(M’FAzPg1R8yDs0;@p+)eHAcjs0 oǤG03IXT >2(kw cORV|yYwk" (hn-9La ȽY_lmVνP຤p s @E B#A@8qA ̽sBp[&NW{M{ A~k9M <{Vy>ZhVpU6)\_ ["p^nxn/x]p5ߝfqu+\ 5}3 mva8OJb1WzܺJp?#~\[Egn{ *-.A,\VR:c r%Gn} Dދ4[kxq{ 4T=Uu؇p]2֜Ei<*XG9rQ@Hhm1ilNC!O) 9&:ʐYqߵgȣI!yzF#٪|*=gLE{Xsk̓(ج3J6ڬd'TS*TdmɭHwj1Jh g.s|Ӈ093éHh|JA-cטHc"{JC, xQSD (X;Sʪ@ $E(qԩ3Ah|AjH*TIk JMCr}D %kT)=*ۋ:`izLTsT@ն D;|Ky<h ,J!#P㢏`;8McC̑ }cb9 bYM"IGƖ)) ["PkAhƙpLR92੎ v:"0V ۂpM T# z9%w}0OR [GdB3GT9i `qZHM4B1 vBa&{R6r+u0Qh@J`/# ۂV`H$P?DS*"Zv@d"MA@i.Ri[Y+P*Zh Ъ}CK>.j+RQU[ o-(z\\\ 1U[z\\]-dbmUZqqZh.&2k@7ZJX;K"%ene ̫21r o0;00 rf v00m&`$`n\H,jEN칍4&ͮr㥉x7J f5n6s#h/l[M\U>OS%?3Yt3yj;9Eݜɢ^(5Er9Q^kّ[ّ+i %w/ԫoO%?Kr"5}Rdsr-U4ƀNۊIqiEiױVh&WCDM4e D)B ڮtR_(]x-ñ TH (ոӢU(6"$'E{j@,ԟpeyG+ ̗ 
A`-78^916Z\]͒Mf# #DP127X%|^$pXj\i6wnF 朡$ V<EhߋA-W}%pJ7=ƻ MY+ƘtQ龕Ё~3 Md$@G'?&f}$kV:A/NJ4,$ 9N: % 3)~M딖f )щ|RJך^ͧZPr+b*JRzsOx*URǫ:V{ ZB[ 84n\M#nA蠴O6V!-<`Z݃@z|oVK|~!iQWlɆ缣b_kr/2ٛ,?tLզ?g~se+φtZ<끹z8iAFW$ $*B*ç.i"'mLNǷiwx!iJjݰrW~k `s 'Fo@,ܻKCtW2:Lg9P}EB]21JNN4I%=!(P5X鷏+99bO%9﨔t=^фZ_K:O/c笇sK:dtUґ/Q)ЦSAKS|h5cbw4C^^^gwr)}OWZP-&>#wDBND:KTДQq%(4/ĚLɩXGkbR'84n" 9$w9PRI[8Э7+QJaKC3S&MҚwoN"u0EtDKiidəP(n1~YʀkaXrڂ[Հ:hb*?oN;'I>fm)ǀd{VOlg|<<bYV,ُIM7ee9 $rg^l|mK \,S w߸!"<ѱ- +)EmjjA͋ Z7=HZJF5NZzZUr5BMHV.hU[}4c:5gt0㙾)*z|9FmҡpK QQ!(~rGkR%jlddciY) StY q63G+g ܛPF fU\=ۄHiN&7wX.E,G\X훧],ho%遳K#~1m.a xq'=,ǛMC;;7!|;{~vɢ} On-)Q3^>=xgJmN-n=Fnx1+.י:jkdž[H0wҌ $St&h5$wQ}F!l2LhӃ_iE?Rڈ$x)*-p땑ArBTK-{!-FzwzvNi@.c%s@R<#vP I#(T&+43d*R$nU͑K圠O23{*Sg#Eڝk 4dfwN0pS;D3 ["_v㣽݂Q..TiVCǫJ_-A BI{SM?J.pxhZt6a"c؊46Lf>}Lusƻd~kyIV\d}To`"ݬvy|U JKA/˜O+@nﮖkob9B&e=WTv7S(_yc[*aP\Ǜ"ܦpΗ܂Ta30A0 '!+VG'rBB -)]jbO%«|csٮZA*Zp U*wRixJ@eÝi#1*ŭs5VS9:Uxʁ TAZfD;EcN U QD-&Ţ^O͘qNh^|!2 (xIiR '=XwKLK|~%{poC<+a߾)X8)U?O>_?DMT #*`^ kԵ%Š}xϋYʈ=b1]fksG4ܗH|BZZESKMq.AȤ'I𻩾S) x~NUR5ɚ?Jr]jqM*y'vfG4?7mleAJ{qYL|.87:Qaj};C$`W慠#YjڋWvùWdtrUD9 ڃ.&W1G C*Nw܆[|o7/u{bI݂*EȸM[xBw:Kd9aTNcY5z1yGâ$8{-g?Dx \B( Ǥ*Jה>%շIywk5BE~Ofh qQZػES~!-OŌݭD^53U˕렵]uLXSHz^!N8 dryOUW Nk@jJ{r'2S&rJ 97F:HϧW.@G9ukV4Dp&hK}MPv^B6sV\UhPJ[HѸ0[zT.fuo0pz\ GbB.Xc[O}$+6"⌷MO>Vt$noę{^䣻~GG -duS--g36°3u{fߘ}~TFs{bS9zD(BB^fɔgr1gh;SN&Ej.$䕋hLF1mL 6g$nyX#bػ햝ku !\D˔ |{bIךHÄh >A0;;h"|0PK4vRBy-:,wXg;,'"RXj:eټ+P%{KX~ 2rRBJeOMJȱ}SZQëznX̽_RL)~اֿ$8}v^YR} qFrISTð)98dlVbPc2%]бU!=7"P/BsdV[ػ#۷_kdsl<-쬘/w^o0Z Iv2yewMjʋgכ/?\)U;E9며VM B+Jg<>f#6qG9@MxԎG }7B{M+H`Jgn0 $%h]E]$%u12Pk.<#QRUAά,d™R V{Uu"rvnuS1qkR~5*8DRV5qR~麤4b]S}^v3l/GA 8)(XoYcP, І=}L>^3XT'-N~P{&/&ޗ7ֲ)+ =U]xK Fۊv+cߪ *l~&+A_5eUwnӿcϏA>?ؔ{Pօe E$Jc]Ik,_Q%& /X*/Bqn]N6»λeA)^~xh6CP>\Aw@btVH.g gd!DwEń Ql] e5e]XU 'dWSYSv+pϔH.{eRt0`ZYF7j{1:zϏSx~ű58BA_x †b}MgSQo/z=DjjU)z ^D5H!I9bB%h~%aJ~0n$Eaq!3N8 yΓ'=g=8pN=s5Gc489G3[xW_)vT3 ;mFZhשu.!W˞oDO7DI'c;)a&ižw6Fa9ՒA)XUH*)DUQ-߸.,cm ͼ`n6 FN<7X:#:Q9^3!8aSV ɿ(e)#  fsIX) ݰR +%B@YJnd1C"(}]oտah-RUE~BK\2ՆM gK^/?"3rgB(_~+]qYJ8zꪄFUA+ s]0 
h}`4JRhfFjޯ')hE@5Q~yZ^}/'ہ˞o9O6'+_L"1wmdNК6l!=i%A JtLK0"sl, qR jqI%pQrgBhхc ȁQ2]]{ΈxcIw;z?[n`oj#33`c/Y~KmMJ4Ͷ}w/!I\As@HאH>Gh5ޠ)N|03@f[G?[ۊ-YvoAJZ$$8\;WtՃq2:jQDok3SB{:$/ PqyA-ԕ1 KoHUcA|T( Bz NgX/ u& C# tD YI]YߚjyDuv|=ԩk'+b޽ӫӉSZSEӭGm!)YqxȷDģj>1+㳂M?.9PV__Iaִ)..C1y4>c;C2ܼyh<xH?9bۖoDA]?U{3rXK6SlrHB^ȔF8@gr1ghSniڭ y"#SͲs8g#j\ bD')mD/q9v˯̉n]H+2E,Tȓ# W7j^-VK2->Ӵ^LKgWMnjqG;W>\NtkR9QyP"B;z$m:4דeq7ٽ:<_b9 FbtGN1Bkiei T8':RXY;IiGDNEd 3N!cpqXœL~N'a+˸hmvkGm#bq 6x-.o5~h RЈ>> ZjDez>3e{ &9G0*M%g{mSSc..3[dzD%:q}\tr.bjݮH~mVp1ϸ 4}׏DJl-o9V%!k``YM[CW㰽):ꀭRU* JxD | B*hCVgFB#686oLL0z QG5H3>dhOAP&!ʄtŸn]jW_=3wxg ቹx>$tMct傠/DSB(꺏} {7u`\컽ӛ% +<.J<24iJ~C_FIpHYO]\y"1`|t z+۾o|܅GHҾ19/$)]xoAUR !+vȱM9yr͖͏i՟4$Zc5\1O>7>D񻹉P?,v0L+M}!NIk騰.2m  GLF0wKΨk#({Z#0TڵƑՠ H,p IxE˼!NZl;ey3u6|ܗ:x:I|'~.`d_'ԊPҷ=bKE η0Q>a,U,<;Y€4w*zUF$_L:h7*_*E儽]' Y-IV#pwmEVvhT 6 E_ނ^t$zj{6ʤ;keZr37mk0@ cyv Վw<9ռΒ'źى%XU^9F+nT'C4}\r.|3}Lr?{̪'f0 d/e  hkw[&B"-tv7zs9w y06nسV}yx&Uۓɤ"poP]zH#/O[6^7_k㼦 &-zqRIb 3<eq[s;^zr"z9(M܅JvE1BEmBaM 0!Ѕ{hJW>$|Y}iO(aXm#ߐLdvD_+f!&Mo+!IC7x-ZH$v uF+)77WoHՉm_j9^3Iya9>Ʒ~{+cxJKIUWp9F@I% Afȃ\rF$ Q Τh$yAF);= WG;-⶚7:'D܎7x{+O"W;[|rm=a5"3I:, *ז LL [WU[&gPnorTq)PfɂڃU@6l(&KϘ/}mCETx*"Վwlsjv_6[\@ O}]!.rr\]XbBߢПLb#'uWy_]}u]Wu $EhMH5$c,XR9B`* yj4؟lVdۗqՂvsP"1g+ZfIr^ c+ ]~QL*m%][k2Im*DNHK4tP0X¢PP,&b9[f67\:\ w,>b)3],E.bYQK¼-tИ*,S%HO_XM@NmW@Ê#/FG˅:ހ f5 r(&$TYSq2ʮ(>vcoDʙd8PT|ytK}[E&Ovp-eOHjs-$'gx-p[%#,j#x:_|us>5q17 ;LbTHB&DAXZv)+}}QGۗ]qDF$Q"ׯHVp'\SH"$IrI xh@ڗh m?6rmO^u?ѩ@5 yG cKr !J4tߑ#eqsKpȓ㵿^H G^F2_H"M8 C%Y]@l@XR,Q ]E0/" *:|od5?-EM,jAwJ+/Ҋ4QhP`,_%LȺtI:VkԵcqX+К] ͍f=37NC r{zs?źOn[_DDDCb gnzFz.F7(hE =ЂÚ+ Fd4aHn-ِ 7h+Pʥڤ}LeN \eHJRy2&<,4hOQuW<{k;逝Kw\}4e??d@G_h@E,akNy5Mf*}>)g)AH]ytPqD ڠĨF3tlH͝UA 顧y2Ck!/NŐ:`>hAUWX4͒sdM@ ]ڊ[q+.Xt%v u X:]q@ >N ,9;q2@G~[ԆbfE8:!Cҿtf"tGl"jMB\(Ƨ/Pu%f4G8]GlN}/kdxǩlyJ`(qJD}nA @g BCt1hVTW,TxM<5Ye/;Pe_1K`k!)JI)hoqK ዢNj.I9qħLxF'l8]O2#{WY׋)1Ꚋ}[}3IɬZKԀ'3+ 24"Ap )= vx֭ŮZd+\]("2?d]Gzm@>w훩x Dlv$Fe=e1QOY?RMc WRךjm\NMN0> 齄#`Ƀy9vD@KUbl4儰cEFFxzBNx02sƉ<ƽ[Uz% y;5C.,ܥp \$^a48Uqleyt"BΘU9FؚμB#>F"W|$2gX#,S$w̆t|B;f3轕^ck:P 
>uGx\qˏR3Oƺ0P&z>\[pFE=s'T1\q`| (- +YԿpޙJ,k±q=CÆN/ 1;ΤĻA" r澻At2$3L[X#1L#XSf ( c94+'IIɝ3-w՜|C\0*$whT: Y!:jB1/6;mT0hΙgz~d. %c}&]Qjf/ZKq{h^T)%v(Gtb\&In?d&G\63 d:wWlZ3󌙈W6Q%,I׷]P#I a3FWy!/V$cBmTܠDge;U׸uA#O7Y` jmKE)vҌт3FU[qMِrrS-Q"(i 6X `}2aZΐ"3t;N(F }d@2^_sM(*ؚQ#cO!E;8|^)(gچBRj0&RRm[PZ4CsdA5H!= ȸ{zR$Gg gجѧXIbaS@*c уVNAG/*:4;q2GSF0S79B>h9sNK6ہ3KaxӻۨEL(Q?-4g)lus\΋/}D@r9RX(r)93ԳaBW*.-+?,xBXÞa;ب 61(K` !iKc >F]nG}BU6!ތ5űثV\mF$;Qq*rtgbJcp;,c BNE0-]:mZwgZ۸җ܆4F*زT%TǗͩx'S I~ F)h8#QfʻEpF@f@79 1iU19řkvtR0)- HLc MHeVI).3g1r Hn3B6)6[ v[? rd>S쩀ëHʣʊL-ԕc Q'. #spk;Piz! nPX/\%cu-FsSiE?i ~n>,}Εc7>]ElUnNS"oˏ_lX^NɯЧpVh G. QF4.+0/y%tJbWM> d$Nr~dr@i8HH@cs0?II?!֓?XtBg˛iW]b-HC-H-HoA JyZ+]zG߿9zYѷGEח~O޸Oxt19w25t2xh%0`\mI 1ebqlZY=ufȐJ9KA'BP)L(/ ju_ !hgP{T@;҂ (Q[ҵ1ecН5,uCO`"R P)Hr 6\"HL҆!{Z11H:n">-<C &#X >%Jl-}CHCZ"fqEDJKP2jOΑn&dIAcdAd-VzL fإ;,ﰘbogdi2̨2)d3Kqcd$,+KLkX Ɉ``>B\q4 3J29cQR`"g$ %bbmVIb| :H D Ii@x^h80d;r7笸$"h" d˚)wyEDJ>Hk#a!@I-m i{|vhՉ iaZ-'chg>b@Ȭ$9IaB:r-w1x A 1C^I]y/)A .1I [BAAXƮkx4`]~XPQ#xʀi/㏉fxNcPC똣_y}4lë#\~OƳ9[dB h[O/\_ #Мh{WW0nzj+c=RWHl@D#`T<yH d('l&0㦇b6Ʉ*0D}@oD` TX{jX`pPZ4ȰG ÷ 40-V|FJ%n0R5J$C"vOiUlg ބI#O? edKdjQIqAȱDw6z9# 42rrhȀ^Ķ k5M; ;Ru~ ;*9 $!g GNEfɖK^uۡFif1<>siU(LkDL)dvZPPӃ?ͮani4n-Np:Ug hsҊ盧̖lx߱&8"W^myοysGdÏni~~Bz1MO3oB+-Jws(Tc#7wgJ#r+k w]\UgRTP_4j& DЄ#Z)v])>emնM=[ hbW_DV*Ob[ kDmzI+d=k$&jĠM 6!ۀX>-RU i?(mW0ȷBHG L)-rҳOb Pt΍,ȊG7sH?yɭ}-Jmwނri/tZ-4@KtvF_bK>E  O>|8%dXxCWPyޠ|N@}B3RTW=fDoAcV͌rtcQ^pD[Ơ;$nM>(7dUu$uuӗS_2r#y?LAtmp82k# V7-SWOt@ZƳ7*D+,3⦐#wx)NjPy̱lRL4m?V/͜mZjy>Yha&OSx6=6Vn \ZeZ ڰ1{& c5m"xJnh,0ٻp&ԙ^Lc Om̘3` dzGoϖ9M=Z/w٣X_cZmvumGݘ'Rć(݉Q8Zl7hN袏!Q}9:=x {\ X .TN3wCybqPjy7v w] { ߍa򱽰IڱZ0q)59VVtlZP]`Pr-jOI1a8z! 
J*| .,ǻ]+g@eT_hJZ,H4k ek5>k: ԗ5kAwm >Bx#1CppP@o0 U,N2(KX Ey:P.rs4Fҗm`EK[2 OJ]C8'݁OJ/Ȑo>O6;_;ѱûy.g|’'}j+L f4~2YzӋ[ߍoFm÷iq\ZLŠ'Ƥ+ۘ6w̳Tۅ!Gd+TvQnЈ"삘V#&Y\ 2͛iOn~Bg;콧CQ-Nlw m/+[ѻ{o` +ɶo{8.98.9jNzRLK2i9jH%P<$Pv¨\IVnۉZm;ɇq"TyTnf(IFrL"'XUQzYY@zDp9+4}dbᎌB#U|Pʻu3 ^  1ȕCMOW3%og>Zlt#j@FY@~~vJI1$y3,2=iLsĤ)iԆZ!hQ 6Oq;[ ߄d*VcP63dddFLı()ȌA9|&s~W*;| D!G!77 xq6.|ԥ7z^d#"1٬fˣ~3QOϿ cY J0Ķ\~H%)zWɂaHy A箘, {,Lr) "&$=t, F1O=:9dHħL?efcFi0R(ɻ#7鬂lɷ >/U>^KEk+?zb9:+@H;ptU<'VyKdҬ4dߒ—1 __{w?[ϝ/}־ &gUAۣѨKa<wq%J9|/Έ ΗؿX/!f/ 9xkF b1I3Ar hTJg#u/>ywG-@ /"s0 pC:/w0e\[UEXN`J:|p6=?Yj {r,ȫ Z'^n]_D>o~ghl4,nG˅^%E])#`^Y D20xi`=;E  *yJgYޔ#DHc7//-NAM_l4^K3"h-M4Rl'X$"XQ0Jz!(|EZޜK34%]Y$+yYڌ=b / rLLWK_Y̫MOvf`0d|AJ~il;xttтJs;‡ Dy4XuW .ejSF!"FtA.J Mpyq`I,UlD^vqaJJHwB!k䖵#勵w?FhYi)7L8z HIѬkgtW2Dv.Anhanw*kkZSVkRDX V%GH_d=_TݓObΠTEKnf'^+7?~k*J+}n>nBa yv0Mj.MϱG  mJ[RrU\^&?қš7 gXޣ{ǜC'0 U6|V#_zHFx{&~ۇ2/=ӣ߳Z}1?~-?$+B7f#vzZz3/o7t$M/qR"aݮ7cviO)7>s{h/94 & ZОF+ip.VJVmZݚw#.*2FU&h!iHL+4]r*FP(?nоv>$a`` ALE"+S.9tw˽A pw[BMwD(;y=Lk0aqۻ m$'<)UX*FD<{0p4lN9_jYw{@S|=_~5qa|sOk"ZVBxf1'ݲ'ݢK9v7bıE\ EB4Q4*J352' ѹ&(֤*k5v\&_{WHM@=f$ .W.5yB˝Bt @0CАR̽HCQJ1Eh*uL:hjs:N>ou+:͛oŦˑpՔ[7|Љ@ܐ@\1h@ϱ';ʤ΂L(9U@nH݉:iP־G1qD1=eO;W[޼Z<yZffI1ofIovr?4%,+|lWj ]7íPy@嗇Z(%لY\"2K?忿y[ߴܭyc=~R"' y&ZcSٯpޭUOj1(1otnE-IE7whwB޹mSpンQ> x &sxW巓KE2J~u/{b8h}i}g>g .rCS~Vor4Z}|K8z1rF1rǎ_?Iˈ{=IVG~<=%<Ίq&skqWg$fv4 vj( ?Lj)ܰ-%|@,8)r B &*Ĩu4d$@E 9MwB0#VYN3kA;< .kD 4.”O MP:yoFH<1^38|ÂۗG>$@Ŷ0. O9K!8F@97̸r6rјܼvB[ƀK.mZ8T1ō(*%Yi;%( 5H^]Fc4)%7b#cvLIrH[@ gؕp-%or><Hχ>,䝛hM 6nϻilI=wŠtjŻM+xwZwnU6%#}fZezZ JL]۔Y6s[@K[MlZ+eznK\VfV[7iKi@iig| l7@ԬkSmx-|wL7'wwzޒ]R>l`/oZ>{lU_ ڕe_͖1MҮVL4}@rT*٭~Q JFTC8U%ԁQj6fBZsq@-RM>CnnP8Ԩ"$?G:97W>jh|4(<[ldF26]_J_:E۰Ƒ^$Xf²(]>'rq@l@| ; RFl_us|ѹAn_jX#1CǾQ_˅H8h>Dڶl1THIB]eZ Fwu*[ q''W3:Elg{>QwOEޓT* NQVQJa5|F=j^^ ԢZ4(gȶ*{Jf>_ͧWVUr8Лtx8*hpž5 pjn;66w/f! 
"0M3Kőo*ŰuX}FJouiɭB.9f68ej9ԉd0N^Ph`PI >mMJܯLZ 99`}v_mxEw6~jpeՅ7L>ݘr}uWlgZ|l?222۲moHQbH!!吜bt 4/XG>;4(9l~`;QQ jR4h)l)Em 2ܒЅX B6 #Yk+C\cqktZ8Tnf 6{gy8hyC1ͷ89JBj`s2bS^78x)S$IG8 pH^Iv-δgqn i8ސF%ǿ4u4 )"o4,螒T` l@lۘ#hd\ |M6O?-O&ElܤFI5Z烎+ oii2+q~Xa^*|q\wL׏˨b)n5)0xb  f|0ҤF\bNb'7tTfp̧˿ aJc;oc"<{|?Gvw6wwa\W 01]|8F|&;78 5;݄&M/wа!d_}=tRdeU{~|Uwʷ'ro~ɣAE}bODmƭCD8Ϧn0!E)nb|BpNjX5V[pݗPĽ]kOp dLWd }+R(Бo\̈sikCkP[0B !!?4: F?Bɋ߾[4.{='C lSE~Y/oxes.;+No_%Z qj8kuqJOʍW ^fJʔ RP)u)zA=t.z0F5 ? F*Zr4Y>sziȁuJ u_0ɗ :Ļ[wK h&\Fd/QRѤ8򍋝e4 ܭ `@9 I3693hf}o#Ow:!co &{0`jK`c$;TݵύMf%6[GaO+n﬎W-\bߛI;`7\ _ZcJ6X__wԲ7=s̩2˜*/۩r6GC(DF+DdL&"2B;lx!pb9m퇪ok?c͇% y"r)4}YO7-p1sFLk-, F3-0Rܗ 0?b76HdEAWt蒠ۣ+uUQ?F]\IdUCbփfX0Hn"'U :FK2du$T {6*#ub3DY^L{kR&J%|m59Pkr A ^0Lm( {Hu) ^(݇EtњY&BOX9|)Z98CC ,%4z",#J(_?JE`q,}NAF(1aiM"et\nkB/~[&:#԰-t+%8&/# %U .A<$-YDI[D*q"bQW2U8+v]7p]$`bu\*fϗ!1ˀ3 6׵b,9W4f^I\ɢ0 )Z.=Z`X='b"ጞHurW]e)g ~5K50Z,=ԛ#;-jon&d6'WOA)ͻo*+Ϙ璠ʍsбB*4eSD"0&%rB*lb$X MX @G%^rrC"N$W6" R&Dy45!,DAN2Xu6X)%$BҧJ+% [JA("1C%l㉇JRVŠ/͐֒*Qy0LL-JiUide&RR2Ot9g]w'fyXeY޻kDJ >˽2>FL>sAlQ8^Q!k~\; @-M] a۪J9!1﬈%f͂ ϼcА3ډ_Z'JuO ޔ[hm 5TU_bkjY>}߹՞^6j'G!fq묹un`</o~PNڼ#~Z+ޠ깇za\KU ;tP IhP:3ZF}U8l2LkT/vRk0P>{xAdKOG{A]S? ' ^r*: 1H,h8Oz;ÑaI)w(Zae& zAa2Unke|+Yl#5rOj8b~KH*ٕ NEI F4:/)TJbJRIDr eqU4xtkCE|*w1ʉj2$mx/έ>UQ))̲ޏONd]euU4mw=w! 4+\q 3(<FR!VUv{e:cC|t| a6$`-.ۮ\q֪Z0@%  V -1(ly i c *"kPa܀h*øEhH}.cg&l᱔C\U,8{q kQT?kᱻc r΁ ˄(U;4ZÔAW!,>X TƝFP_cV [oj/q$շA['%3&\㬭]/N``4$ b0 A$PL8*Td=t]e YS#|LP#ձؠ'LQJry5ܝ,¡.`RA!"S.Oa&rS/2r ):-#ĂA[JF b$%X6,dTC?AZymsNzQ"OwÍ{}{Jd$+YSR*1·{EkƉ-S./A9;Y [ RR>#e-dɸ6_!1 2F5$XDHc(4YYJ\kҩJR)^.tHbM%`09K1sb hi VYlbbthJijDBAS]te6$W=)vUEΝ:ͮZhP]̮@u)!BcE9@cWUO=TSsFKi7L@,it'qjx+|q@HӇtPZZhq-E %JWY#gIDyp < ,:̰-bfcLwJyުH^$lFsY]Kp^'&:K5d#%l' 7W<φQ^N#@D jѲm^tu%%/#91@c.03JP PfWc~}dscE'IܢQ62eK^j65c?.ų4;?ɻxyYNA,F؍64a3S-†hRFf̐X&S3]n;O Y}hC\On!A~yeruHBf 1c QjXn+mbX1b"keZ% 4'I*@u{AB[JHֆ|9e uXIs>! ".Qb H \(&8F-g2s +$G2 Bhd.˩̦H?Od2N6b=bױt͠&-vԁ'LNϡLFn~^?N&1rǷ L=[{NR;ճ<[d.+?ghFȊ.v%{ξ. 
#o=s~4hpE:E$QʒD8;䍰?$:gu4c䙃Eۥo1"Ψz{ȹB\v8D|mMBmS?̦Ն!&,05ys1[ퟥ)k98#RyVm5}* (鎅B܌ŨBX֢!qNWW$>MQwV)NnV,!X~ێo)*sMmj) Yzl}uSdNݷwmaJ0ՙAt2wTB"B6 4@8D*pIÉR l"9UY~4^ؐj|)dO_1ɑNߚctkf?yko))U9Ed@r.Z?sy̅Sfar<>΍$aܶZ`ly81Ƙ` E/=P]Ѱ)n:|O>,ڹbp}ppd2",(bzDnB$\Gn "q?}p+77_r2Ɠͫ'~:R]^A [o6†ͦWPM\6-JWF88-9$fƓSʀ¼ݝ`6k񼼜EVs: K9+ޡ( XF``HcD4%nxZD YidILb& gx! <\6T0WUJf(<?=u[\YMacWӽj?ix U@qjf8Z9J>L2VG@7TȦDt@*LTh|{V;%ues cKeY۷?LÁ9(Gukk j(PܸZ:f Pͣ݋ ޙJSp0! [m6YY5DehF %Kr Q&"_cN=Lx3Pc3T?;wCK769Tow=/vwK;{:ގ?;&Uhx7; rx:l ENż>nfZoofX|qjSq"# łHc@lv?yX%Yg6۩f+kv _Xq";"_SF5sMrg97Kgi|R!+~~8z>Vޞ'=[%/Kؾ?.w^-Hń;mkΏ͞^&_\b~cXo8}j7CQaa4v̩'b&{5E{u $Fqn1BX @'kBl2I'uG2@!Y</3] vޝx E*1헾>$۲|Qa;- xIl=XU*iTLI8Y֛!k_Vm! I /׏@@]X~(KZ==k/6ˆ`!4œ aSJ<23뗼~^/qzZ -oz1Rޛrފ@}+&mÅ~SAx]ϫ4]`[x컧͖^jzt-fկÏog:ԁ}X3XSN1 )H륊N#S;(Ow`yƼ5Lɞo66L6)l#)*s}!zt> w?aB{=M}ɩWYa73kϡx! EAHRpsW14zb"/?ʻGg/''sOG@mF+MOnQKP ꣜(H}2?;u+ϯJNN|[q L0{T={dH gBVRƅU_G/ ӂ&¤dzΦk¤Ua.d0>&.X#7 R8^,g?77 K</OB^WlRi x`#ѝK+8N62݁UHMj##ɸz\\+9Wĺ6h(aGb9v0JzC JK5$0uH;H͘7jy_ ZysN%BcʔN@ ӊ5BMJo/u4AgOtle#faFJzs~C{*!-N?|Om<'~'nN=0J rOLSřč-祔*i_ 2XJl~n7DigO4)?d鶸IhFSzӠod̷h]b}q,17mf|eH@2k5gzj MtHbPF;7 TiC\TpG-+֭ӹsٲ'rF`l|й:L ÄP//ׅ{%R@$4HH a1RXcD=Ɋ4f!'<9CBe9HFm^{QlߵY.)[ȣMR[ FOJ#F5~=WQA4"yHD`bEuCQD;s P3shFR!A9"OC:&rŃIDu|> v$:b"њq۔d.XheB!*u,6,u)`nGbƞ (2w` ug43Nd-b@2k= ^c?@D v(@~ 6@kk @3%R9 D kxR˺|%bZ1h# ,IE;h!Κ(jS:!P-UXJK]5g #[Ť\w9+ikS/ʒ҇-CH-P)fkp$`,WUSW.T϶͠‹s5s>P<-nBqD@SClqA?{Ԃ_h*oI<鹑ڏ\.%.D<Ӓ0=dUVp{׈rEå]!ntlgY[>Ծe?ǚrK3SDui*फ.޾N3j1J+ k$ڞG&! 
"Z mRHTiIϦXP, '8v+aa mT$uS[TZW6a;7wF}h"N9!/CqQw.%pt \ "Xu uVǥzWCcP݁Z1'buK'I>_X8obV}_@L~,ݝgV sj+ucVG)lG/{5l@͍ȚRC A-_^ܦ:o%䀜 NlG6F䟔sTOp@Xh'mz&z1ǟB PfeOf"yZ.xc}322ee)$0̰ |SlPɅYވe=ԄFgш"e`=71ismܸT]\KHT*A L-f OF4ڤ)Ԧ@Z#Z"aRycԅN,!-1b;I"I^LoSmI`S `˹RT9kSg~HZ&0BO˜a0CU MQ MVڎFg "8Hd jG;h$)% (;Ǝ ca@<{;\H.6(.hx@plAF'(# g򢇥I`#H'`P Rߐy͖Ti 6N8 $ŜL{+Ar>ԊL!V/0wc6"7Ǧ,LEDpɇ~u`)$c "`1ˆ.+'ئ".KC4IPځۉy޼)y9s4hs36!YVӃڡ:5sVy1K@+^؀=7?T7[mC)(01\pH|qypQ<ꏯA( DI"ioSϒ.H0!nׯ\dc;ajgR߬w`/mxP(4V-Mt|t$qVBb.`D`Nz;n޲FБLTR|Qt@/Sީ%Ih3nTJQb%02ZTJk쐰&*Y8:&@߂EIx+dHI$H%Oܨ}ɛ t(kaT5Ԑ,?^Bzυ{%=k?RTPSDp`T\*lW03R xl\p':'kaSph_Ϙ"KJ' ; an1 QެgϏ9ILflOkݤ8zf2xie1/hGlfaDyyC=1,ӈQ3fXTKC;-M2gɾo']M)Cm1 ;\=0 $3Qjb]J kU& e^LL gđPM”t3%ªJF6|H+P2 KI I~T6DTu6D!lV0휹zps|O)V&aws.xNjaeq6(#k_,nnh  R]!or^ȾZ`e΃eQZ]E=|(mchk4GkFB,ޘqQ_3 `.+=.-<"Fh8oDSxz)*e\u?$dPkYo3J+ssW9 z9 __ξ6DIdG!L z(b;ᑑ~k[8ll69^ȆfJPC\h$<|HSSO, T,BʢQg) |4zJܷºeC"**; Q}mmI+Zw0DIA]]deIhdӉ[^ܠv݁<:Б0F=Ċ,+;{qIDRb,O34IZR~`nuxX 'sf4®n>;({_T .Wb(XC*)q̕s)1j(("B]C=CN1z.Z51 pTmLuIC< 19?0fS$B~HH1<S>X a6J zKS078;qwemI 4UG֥>JzW˞yQ]5 (˞Y lQAB ѕYYYYpӂ)nNZh"B0Έ1^R3DHe JR8R70J'oRVk8mJO)YLi;ϯF/.4OϏjL[KTe:!!L߷ ~ Ǔ?\\ GNPBg@ ;~7ږ9J?ތFx}"pm(B0=fW7}9s7}jG#j;G[jJf.v\FS)AQFv]J&.t<\BÆY wfXí#mDM}b"HS:4D%5'}wjyhB^[~6W?z|1fQydB|L&: ι련afmVQ @Wǿgm? Jlley #vr hiDρ~:6?&Sb8TKs5|DPvc  bk[OiT:HEKxxN~+w@Suŵ"g|XH>J3a#~"]qqymCpy=Q̉/ H,`얏*Nm!gk {3"&5͊d-m`c_&2j_ [ѴS]fT%d Aa4Z(nE$TXXOseR(\C%+#쁡T C5S|e֊1^6ybXӰq{QFq]vcK*vuŕdpYkQ; ҔJGz$]s=|ejs';Kac3c|C"` &W_&~m\L KS eA (_04IKbwǖN(׏b)zI'(ƌdhI'{.SPwCRSd6lz~}=jppE[1ƾ[k)fSl`rC r)u7c }\\=Mg;]t,!7|qs P8ddgjy4g[ 8:|u U מ+] KDVAbkO3g5^[0x \Qmۻ{`C*x4S`̗_M22z#JW0JrRJLbGʴƴ?9c=?#(3Zř=d&+RPgSOO>lβ@YG\ u(Pk `q65S/>U|扂=.}Ӛtn׹GuI<2[NzC4d/ LLcA{^8AyP@.1lS[Li|JNk|B]]jSFȿKb]'j'Pnzp+V]}B [/[|tWv H(j,J v)$#y3r̴n@rp SDch.kkQ*/,ϕ.%n|~?Oӯt[2TM4=bˢ H{D׷s,)YRë'q<3qL&0!j%A04Xrq\z<+N%z;8Qp8h;A:f28/ޠP\i5`zj>jJ贁tYJtYJӽu96LD(xQ4FG (ށQ¸E1ZeqہZq-(|+gcʣK8=T,9^,Qr2DnJ<ba B>U3R+x0Ixr3 bN&FǨ0<*7!5XfkL-A:ÔԤ%|t3-wᛣA(O3ULN͈^2;L #R,Gj2(eǥN$[\DžN+1ƀ 6RXjTV2! 
1sQX#]3PSQKF/PD1ZaAX "j.0hc^3b4պLǐ;=DC!R !AGE,զPh!9|Qd֔t1JzIR0ZM H'+}l #\=5q $=(1>+9W*u"1AFm((TPeeK{-ATpr`|)%!xHC('єB&4YH>KGg1B3@P?̪ }_=7r [D30?}w);}{ubR<食8׷x A^ߒt?;©x23䇛 ᧴~~߹wpr9f?ތFx}"p m!I \|+1Aoofɑ7iM>-=Mm[SsSQYFIFIs(/A; =`2A1F1fk3j\*V4_!r[hN2bw#Z702{hA75b8.) "(K%.5W BʨVl?91LewdcEq=~µ(=,Sz01{X-]˓uƝ ϼ9$, }+NH @(G r~<䞔S&JŨ$( Z ZhkqcdHb=ed"RUI?Unѥu}*"MVnl'I[ CUKq .d0HCd+)]I1FU|>bYQh<^j, jzuqgo{-@^1CYۨE?;~hƖ@DT>Lx6f [9ys *vOcѕZl\[3\f$y-3oIF&-C>fm+w( FWS4>|[ZSkg\pzNCo8a+Zu ±p> IUE+JVUU4@~j/>=(Ȃl]YEk:JBgA>eJo!'Eo2D[VҠZBW;ҧ\kֱ5"y{ r@:jc='hr%8N#@l*]rD kVov Q!N揧#w^N6P+ľ\yTRKm r0/@ c|u||f GǸdWb3q]\x : |:zXꍶc(O6_I ΘS䩨TpGٙO3;T}z}/ '>?o_Uo3[G ôYSGEL"]pM^ۣEi;FŨEƬ[͵n]H+C1hl-mұ {I>5r6~ͪ2eRw&\ϫo~Mo;~s1,̇~f Y97xhk UYDէ T "G53k <*Z ' VZ勷EE~0 (^@B~Q.::PWхZOrxEbdWw 3+iY .-71܇,[i9ɾr*3& SR$|ScL2S%|f"%0!apyM^;]**]i,\Ŵc˵Jb_97WcЅY }s=sjagyh<de3M5GTZ{Ysl,]zf\7RK,/sx/F|<~ʊ/{+T1y.$%z_Yf d/)b|rSVbh# u5cNmL^"x<^[8)I2E'lu,:tBL4MxLzGkI@\S k 6.5H\Mrd.cc\rAJ9urs P͚6A3J0mU( 7e4LLr,AKB%bxl-k07IۚB N3+vmy0#-"-](:ut& ~ZH=eT8v ek@Haч<#&m(]r`ݽNj 7Ĥ\$x6- ONKslbeլ]u2wJC_JX͙e4XKH+ͥ<%;.Ymj Ot9ؘЏ !L/˗_, my+[E_LNjg ɩbE}boe2=Eh{h6¸(pA`*Q\@lJv`Gl'g9ZbLٚ:KpY1kjxq[I9P)ŏ Ǽ'\ ZMzw w|]s]6=uUe݉8R:iRZ~F/~6&pBaT61.ubYI|iFZ|F6IWcv'w L%QjrjF/ HLA}U!~iX]/;~mHEOD'=)^bHE>_f)Gɂ-?…m,wqϑO p'01b,HG#pՏ;\-̉0~q(B` z$bJ++đ6RHV@p;v C3˜VL zu*FstO&ƽXn:s>̋Oi6hZnSӲybYt?{ Ƣ`͟9}vC5A[SX1U\NDzj9 k0q E}X#qaX(ע? 9Bb?:TBkp$g됚 jx0 qS`jO!L"X-rVqP$um#>KˍZ`&%օ=1`<.D/r 9I6Vd ߮F?^E҃0j^2xuB0FnO̩$=ƩvA>j8 .jPB8 1юTU&8ni)hNL^j T%8`tdTh!a\bamJ 0j. Y3#e&$Y pbd.\*B\$|%"("ZZ[SvB%8n"|e(8^7VaMZZ&HP4X{ \SNTc[Fo*pOj`p%nJ3^ 2V[ 9tuS7Bcֳ0 _FRH0 sjq=1^>F1=JSw0 X.qFz5t;7&yT# SKl-ֹpV(= '"vfh]{5@1$<.?lkJKUǯqs-[R~ŵu.fi|_x^\;[™+녳<3P{o(YJ(p>Wps%j*Ert:~Ԝ$Jj/4ŠG#. ϥMA$H Ih"QD8@0QRK<7MYZriSsa tOFѩc-{~2wqnq_dMM<Gg|OM_}hzFl*>]C}t6 V[SsΟRiuLzts.X,w,Q ƧhɜbmuJu+ g>u;><61u+hub|S9E{o^ٛsXK:R}wyN[PӅ5 ԳP]+P?AXb[}~zzy3N}-J56Sa@ A eץRŹbt鬝FJ_у|NY,S@U生Vwj\6?yʛYy(V4dV RNjr[)sn.Ɇ#jP+.1rBLb,p_VڠTwK>3 30s+Ā_ L!8~P/$ƊtB6hzoMMoJ-DJ##%e` :2rQ ! 
PǼg*/}3pKt@iJ(\;,ea'C }_ט{J!X!u81[ &B SXZwY%tBo[1%(OE@R)Y9LZojTL*J[GSD hQ{A\l@(hEQs魮N,rQW2o(B<@[[oJio 5XϹÿjNuM%Bj/cj Z1^ᖙKX)Bao=f,%QƲ眝YC8*ƱyїԗYz;EȉA~@q^ط}UG1'oNhnvKb <M :b8hB*!uN741Op3聩7p=FV(x+D)U;h2=AV<)B):یR`w+^gw E3ys P1E|^KtR)!Q6LJ{<ᢍƤyLƢø:uuф<u|:0][HQiGu2CwǓqbItr*#R] 2#+] }xZ)(+{,ͺ!#a I Fk-8XEq)6M ˶]Q\ZFz9|JD_ ̠AW0_N j7_vNzY[\ ǏSBf@cL8:l@xSF{EK[}ɜ3rc[˸m%NxaVzn ›xO &bs5jp&jx Tb|7{ӑ 8/?߲O|B8Kb!LR*s1FNf77?5rL-eE)HUebk侯;V"biԂR,̂X9kwJTD=^4OO-@to.TjC-p,\3$NHXy]lq/}W_|[ݨNeb)u`GՋ}jɑ=G.vN`S m1y[UZ36f>ΞuL[ҟMeWbBfᢹK?͏4H#T<ߐ^?J.o}< ur >^DxhxW#?yΖ^&>^Oq53BO;=?<<룀d WOo?1BJZЛwS~S,zx"I#0& sF'H(L% " '`9QJLvǪ3Cಗnw !02Y1}::JiI".^[^u3K<~x>V?ML-o*76ĤbR+H|F0&eT\* &Bt@2,y &B$Z HY5#<Ҷg|k.7 `} kof$ƌ|fʯČ UȺTij/kbXõFL,Z 4 c5cS 聝4;30 juk !vWL\cBHd#uH1]rΈAhX/f7lߏG!5"X$QLɷc WP2^uS)SXgWf[SV ؟/8ai&m&kO ȟPw2h96ț"Ķ!1F/ލB󫗍BN;F^6 ŨQh9auk4p+ݫ]6]BRdFZ.iq{Y۫624B1Nlu9ᘤ"sn;r]aˡT2C` N3~(Q a"B/V]y_*[6BA^nWXBl%_Gſ系k}T7zHuS3fMv~_ɒig~ ~wnJܹ)q]qJwxu3v:;brYB[D(oXJ+aBoMoKdO Y^m?/mak Pcgҫ  ctyԺ["zvq JvReB(\Fl(nJ)]v$M`*`dٓ5{tL`5rN o6|F8k+u\:ϟt1 RJ߂JjN%_ekuVDK9{Tm=7! j:ϝ߀"pSayV~߲ *%c49d& U=p1JȚOD?]rpE5+?uuLK:ӳΝNC./'ʃ1@/!5F=+:uq9E])\q3n쁻ªjօmQ]!9%^ r<տ.K"jо8xY/@cQrqI+M!B:U#jŸ.;HBiK++I[ זTiHx6GM= g_cKUy*bZQ40R\ UnjPɁ;Z^IjheuPÙX#7bk:b<ߴD3Cuΐqg L2"aϞ1&W9PJ.A()5ct衊\Fu;xSOtPWsݼtq0+E4IvBU2<"4} ]#) D:a&\ qeP^w}$M9Z*5'&-j:Pփ&Ow;Wb֜Ks'ꊵW{O )__{ N)*O+VXeeB%9B0Xk?H5meP7j%NAC ?׃v39ɧs;9DK3 :%ZFsA`+Egh@YJmӵ3)[-?͖!n8|{(m"H%kYe(w?6 +L<c> ~*ڤ3|D[cA̔n\=Ǿswz" XIchDzdlA&:5 )0-RAXAay>dߠ?oө5(r2fG NѬ~L@ L*th_Y/ĎQ`cd<!5p01:D+ED$Bٻ@b:2M"Rw *EDB@r O!6K9V`}Jez?JÍd4fE]AxZ0Jj1>^-^慨gЫT0 L۹vcqJ#b'+46`oNҔ]$4QInᣘ>۝# {n 4.,"*ԘDˢT5E>7֕*L@R$\0vϖ H ,@xvg'xwWW\>lv.fBXˈ,#m I$S M##6Tdo;t4ܼz>{ -XΒ`lǕa-Xs93ntZr?v/˘$rAwy1N Z^=H+:(WW f-U}5;㓽>DtX|WWG<7Mg~ud?,B,i1띀ds`_zم*3nj-w;1 ώie<4 3`.(a<#%#xҺPz^pܾ)@!TKAt## ')vSJ ]7uuy6#% C׎X^zkrif+dXм(q,.;ӿ?vͯ|>!*jôvQV@d$PvckvNDm6$Ih9 Zzۢ/"8= ~{o'w'NWdw rc~x2qɱD3ww׃ͱC_?,>{A~_2H[ӎi-{}ٸQPFXtފ,W=ZX'xdԵzkb>ۏ>ɧ5h^}uV[CfBb5zi|o\eRRkHJ7;PeFmaIQkj-C3!% _h 2xb vjy~Gg>hZ4. ֊Gƈ_~4@I/j{z/TPsr? 
OWUۂ 9qQΟnݯI%&{vL&}ۘ[BBk+Va=̭:uWW'(V:shDVgo>/Eg].ې~[xWQ/yU˯/ b,? vEwQƣMo!2R[hx YK#}02O=qai_YcN'v1 W\ :0^F C5+Q&mGY/Oqޮ3x=gTbxxHɲc>^lVv}:F.{wdb>2f.ڨ[Iv$7<-+]oZ=}"A{ɎM᱂ͦtxQe^\TPWOi[wau tB(x{T䍭n0խ 9s oNo\NQ|x/|3A0ڿh,˞p Ihk{ 9-&@q/ceb0p:ӡVṇ/u M>#T²:7r2^8F']&uqr:4f*땆$jNC_|OGӞ1U95ri^9H#ELUjCDQ}RD1qvEP饍㄀-{_#hm,R o-<5&o̱-bsxAz4wYK. 1/N埖C] ] ~ N[e?]*ՠ},ӰVpi_V@Z 'rRH+Fr=YX`;U/;:a/*riKy`"xLOY^8pKLxܤ{}]%F 1wBTgOu{,쯌~2,Gd&Gno/ζn[Ok&vh1POۄ/' 2YꚍU˳Oij>H/EopS}H/ISP[K:jQA:j3vĢ ^(pciFrϴL =G:eABM>q{Ԧ]rFk?y$Z#yജtF'=+9gq+)).ȴ.>Y~K7GzE.xu~2p]_ 3?)+* +J긷^ ((-9j hfh3q@틛P96#',~x~ m*LLqx.ЫAQ`ԩ&_:YX’<62bCMo/k R$A#z\D ,.U+KEXiaJp. !x*jo/uA:CY1- e4!^I]&U(3Q`  Tʓ ZhgDAMHBzv;X}Eeܧ!߫6 z|aH#hAU]ac7F5;lv5ݚv; 9![4R 5x?VyjyЄ[\Qh]vUްxmM]F+dt*"JȔڶYn`NjIBDRJZ0oHժ8]GrS,BIA )UZF 4Ы"U'ʇ0)aCTX4ȅ5Pz"NqGFʤZM!cCREPT-9yھIQy! ,ސ, Յpm%ScnݰMڭ-9Mۀ<ݔDl=\օpm%SrvC^9=S KtZv;)inH 5L G,䔌S'nk;j2t3? )](~ճp%U uw=V _~qWӈL? $VI'y!P*B(g% u*CׄT ,]pԕ`qP7z- #!İCFƔ0A ׻k]>wYkltS93;X2}IRݾLKـL\N"3Q$Qgr猰Q/R.KsT`ZErG}) ׷|k>ruTI3-%!JR_X7"Wz3!5 !ˊzr}rW7fOFG2^I&U;U֊&a)laICtPh,#yῤSi6\cu>-,乌R+hrLt3 ]i#ɿ{.<3f>q,Mn&G Dl1 z->ijҒx{Hǔ18':y=?_t-Ivq(~Lʇ6:V/.By}.3(ûp˰\A'{:̀*&|-4iထBY)0GHR5xCyz#ATT|g#A ͙7FsQ?M;56-J]uLWsd2: *͚9fPIÙ cb)(p$~u|$n_k(8he0Qce|Zfz{xޛ7)цvYSp3ƫ.6/7(Zf>#~ !6oq…!h9>%._w[}'lf5\UI䂕$@R^OLRlPr)Kx ӇD?)VU򮒕w!*Z,uZ2,KcҏD?(o%65ORTOi>o"X.^]ھŲǧծIY]뗯GfPB8d7D'gZM AjO %'yci%J8h@gCdc68+? Sxе](mDRt{*9{@` ^GX50'JAԪ &1$@8KŴV=pi$H]-mVOی)-.y5S=KK4TU뙒RMZFK.p6&pɉ+0]ֈR1h։_(q]i$\Xp2(D %4UWV@.lP]-~.q:hfx0U))I>h*ҠVhPPB)3t\Ź8{ZQeF/AA>]k552 މ,eӍ,ip2UOae*-l ˧^q8y>.V@ؕ{A.0\OƒD˴e6pmSu :iW،Jh?|Q?=83Qxj8 ]6uuO']UWVGL{X@2`@S.w< جHCCHu sHkOpB:h5[t#@p2umo-B쾥Ho+6BQ! ߩ&yi1djT-T!Ku&[%ZLʞ} 2@BԮ(\#Shݧ[>6PVW0B;SJ `)G8rޱ5#S5 ;f㠧 g.*y>54Qtd\jO_@J gG>>R=uw⍬c6n5]?cu[߇] UEX3]7!8 sC|b؈S:HmF/nWxiUKdU*;qWE|%7.3;+\2EFx)nq|Er1a0/ `uR 6ĝ_xJ$q]vSYY2geɜ%s\2 SJLي" D\e,f9rZy1[[?@tlEuS~Hk|2DƳ$)&+Be2^ 1Nޓz@eV"!Bd[iG@CnoLG=$Q46%KO.$"9 ,ZdHOgM:52F.1)\D3Zg@XA![UH=8ptEm4}fb;GuPk"@`$!J QxhX]T[ƥ6 Q抽qݗhpvv#TuAS{U%t4V Y z-PCƄ:) YB*ڀ*袀XH".i-tC6\n=J2}dD!y3Kޜv9oh! 
q-iXșTLはd);r+ִYdmF4`\)XYi;Th`4{Oe4{uGOʻj ]#Veiö'^][vsbo*9?cH>LUeMQ"9vVѴ&v&6^M[?z \/FOm4j%^2Uψn6*pMTV-b>^5dCƤ&&-5ٹZ[\{t gPʣ{݊ V!riyffRfԇy=J<}dCb~ &#DAl:ܾN3@%80V5o_C:Dγ29՘eL"`lkd!]3 A6xke.C3;CL=!>͍YE 8FH΀}qThJlI{Ujʬ4Gz@) Z)&2L |“$dqlhhm+UeN{@Z[1W)F +tB6]? 4Ҏ׸r]]_]iz*ffҚ#^AUEtjx'sާ*z%#s;,8!i@bJx+2]l0%x.Ҋu٘ \!$Duu]ݰbϗv[}Pr|72bFsDBoyx꿀_'5`?NQ&q͗*4\_6Si:pey<⬷_* *1`s",,9I')cWBj+N) k!%5KӋf&d0q;;3>p3tZB71tfH38N?pNK*OY,ݜOoF\O:n(-"`Yճx6[v<+t2HLd=mgyzB/y5.뻒MU"X-M_.yVnۉ^p;k^9G}w(dڦm̓\]4-Hy}wuK>Kdq1ˉ;tB\lZъo%|Ɖt ayXl|2WHFD5yhIi:%G! rdP9Np:wKlVT)keO*M^m Ќ 9 5Պ]-1SU Dg "ǝ唝7xkIbP3S)oⴀauDq?9\.yb-H`"״7kWDځny~S _d10P.#FuHmӛ c9ѳ䕞U+Ѹ28[m-h=)F]FrOQp-[\؂h jWvI6r鱝pi@ӂSVЙ`e;ڂpH;N@UBBB!Gpe>߹ރF[xvqw$PmoذAк'B6ZNSn4Lk 6m`mr?{Mʩ4(qEʳ tYo/+ՖwO>]>{1O Kq7ɒߜ]~U&^T^ݏdHP9ye!F'}֑<%%Qp:#QrJOf#!HڈmMV)ktߧOxJw+RF'OPJsSeל'![T0q6m2&TN^j楧M8ZDQFVR Gp(AXFF&@rF:1{#gR2J t%9f |咈VSvGM*;iZGZ9a>QCv$ТcTj0A/i4΃]4Wsp:i0iyC.ѝQخUɟ/LH 9I :%(}Zg $ ̉x i 3"e 1ĢpO\ڗ}hm5T3z.$qMYԯ*m+V}kϻ3~"iÔ$F sP MՌt_TzGY1 ӈFcw~Cz!~mTy cGEbm M+j00e{̚kydzwsN31yV5PE(?uQ[ǣe-ixԃ 1h3f11[ R8dnz4EmE{ 84x[,&gٸ01&k3kΫypF:淆5:zg5B-u _. qcGl!(us[8l`"mސwuƶR,<F'&QM6&bt=ј`ՑWDU)+'l]ğXzMkN4~j|#>Llh|\"5e"5PjR:0$m!b*bfH8|7(/ +]VԀ4rQQKW K0gN rDC70%kp r]] hmlHPDDFB>:.m5+B')`M/px '!fy.L[D.̱Df30!{gS*dc 9T :ȉ2EB2F6pA#0Hk:; 'PZ gt*5Arb~m*x7?sh%=87sO߉w>|zߖ~X |LOJ0RIOwޞRt}s{./){sJ cgsnnrquqs^;1=_\'a(lax퓏oS MN<2~_~%4<}ƌGLFĴp_q1dScrLuQLYQŚ:cb.#š;,YtS2u%V- /"Ndqz 5LWe5+b4RGhHϸw8cRf%|4[z%c΄Lo?u@ 8|ۋ&?:'ucTS#Vjℜ"T"MǔHQR#2&o)J?zGM3]LIm'7TM.r*扦< QXP**)h=8_4Θv1&5*^i*z7`ʱ) A钀DɨhcC/i_e=mkjܾuĴ1PJ5&FeeV|-L)>BQQf%ig9ES L8-c+p\2 <Ņ.Sb;UXK!BS~l r޷}L TqdzBy9Ĭ t _u˿Jl̕][mSޞ0 /![ #ϼ䮿~MywC9V=Ε庇ղk$|wy 0dEtwdZ X+sOi5jzl[b>t^X^dd\7U!gwJURveTۊ%"4H]F)b8WJ,44h4Ѝ~SDghYXݚQkMfgCy q٠*/xh7l#\Qݣw^7z#҅rSI2 p3tq>9q+6|C9?_#dus0JC2 ;p`SIgtp/p4hTo4c֯rC RNG62ʤa͐ P4Õ`UhD#ae1uozb6c4"̰1vRyՠn4E90[_^ly k. 
m}!YTf=< 'ʯOaV2-9BQլZ¼ 7ZI6N} R)d z n.b jLWRrgtk9ӽ[b (:gtpx6˧{w{V1xVwM6˧{_F/Wr{sgbBuSHl=,k4CpxzmܵSH(k]_zv 㭃فNupzw:Yo_ 8#Q/&H*XǝjݥoމΗG'ofGY{רsji5zC.ih=${Rh nKUzazHnդqtg߱N R¤JǵJd -oܻ&c8BuWBHHᓗaZ]JI.cm9΄ Zִ;&j'IZѸw:pzof[N3]|w<)̨#΅8/xӑ#AҖ/9agTN&5~DjCF2\\}DX Eug8(S]j|O%tHĻZݿ,WQ,c/srvF"=Ǹ{p'̐I<^+R֨:NxmH ۂB 2VX߄z=fŃ3}̷Ry(H*Z5~ƥU%RTҔB]$M;wJx\0ߍ*TO<oTOGfm=F'\Pkфe>9 gx(AOl "H,j_+W͈7Qvh]F-Ui# DC * -PT)M@]YAQJ6(WshS ttռs*ٍ[QTnî\\}2`잰u(m4/66VG㾍f2J͐%Hcl,kFX[o\Bᑰ1n"\o1Juyd\}@xuq6x\q/w+žM k" Й/"8T\B]m^6Yk1k ^]|y+hބޔAJVΞL QwaHQC1tF~zCN$k%O t:E}?dC/iYڞB?uB8*e ZfT&3P]: ln֙CHtʲNG3t0Rg- FP:PZ bQiUę#*ʈ0Ў=j!e`ȨTǒH1ti!R0[/pc %]`\sޙ̛}9?d 4\nÓэg `T=[h{R %j%о >LyE}? wg(!Vs3Fljm?>SE20k)> ݖOyx]ĜˎnX!9: \-FK/08t_GZG{'r }JMz{fUjݮ,ꕫ{Aa^ӂjw-п_]J\2ERj=Y @S yvQfqt*G©貐yy3j,3$Z/vLDyF戎?65VMq˖4 P]' ݏI 2ᦞ/h*ᦆ8 Jǜ@} >xn c()acQSYNs^4bAj:\ړӲ(YN݁̑RT(99lTnl-cog16TWe}9L7 sotn3 ( RsȈBfN{D~$ 'ͬĞ(_EK 4RMw[ B]K&g.X0o$Iҷo'U}y^zjQZ! 2h)aEAy XQC~Ny@~4Ys :i$g@@yCڧ"*7Ei4Q<Zntȣm#:OZR ƃ有ςB `mKԓ:IՎTۚru74H6|{0Y|?>HG؛`liM+*iC^Nw6a_z}KCPU[cjM#2]9cTt ח'i}=_ibɾp$7OOO6BSHTwO/y*~%-><>@_PU$]JӏopLfh%*߄ſg>_nr?sq?$*!3'g9TDsK0KAԈS g@(3aՓ[W W|(~tR1=.e @v#bWo yfK#ꊊue1k.Ja RU(R2r?Khٵ<6޺Br=#YsĊ0ئ)f]N 0%H5㙩Q75mU5 N x۵Q+HWêaK+ a+8\$ќ{A>¶yCGqNEOٹ}\tv(}Tlt[q[;O遗J :DJ@+26_U$ )0S qA k}d#q*#m'p," N.*%uh K]c,W; 48smfCsum}dkխn8/HbX2__WKq)BUC!rDSň ({hqḰJ/k!@b֔*~~{4 V+jY~=8ʺO?~ t:[|uus5#uH*EŭU=IV$YՓdUOVj/p1pTD8p%$v;0Y&HEeHBҫ[y iGΧLoK'~agywwyKH=W|9~ߜ >ldxZ6׃ߍ{h!\-q %]ɲ֝3عBYHX\1^m:g][4=vh蚅cO%%n'; &v:~/ N~j£l&],[CAH";OR]SwVJ"r`xhɂLg_kͮ Mb^T˸Xh*G1s16 A#dXG}?&QEl5 9( R]KlI`;q}T;S)NUָpz;*o@jvTU ( ue:%^ aLqĪ,Rs/%q}|IPPLtW׋nq ~MLGvM8ThTgn@K*(:^h#< )bbt%T#2r@GTB0'ZDH-.vf[d~yF;E.D+/>"\\ z0}H ]a*0K)@$>X/@m5fD͢;?pާ i:h–]eg+J_yeOFEU]hUn'z<c6RDF1JR.SPFE2H9LL)ˉLccSEIL*8bIB%H'71rRJ$ccb8b< HaĩtkFa˞fN ӑqtd!vN`9^%_sK^aE8 $`PYU,|/th~ !MM#7,g. 
>t3n)Qݾ~-=ip)|pQ?X+<]o b[.Y;_8U4قRqL+;:`o* d8X~v4*a52U,ߪ/}3}jl(v?Qspզp/3ܝ+UsUgmY"Q7w_ iA h+0A py4S͑{\ls+ZeRWNl J PMW) BO5.Q5l!أji wsIϫoP ngX4Ori}<I13=,yCk;f]X4$!/\Dd_'h7E0v GtBigE3Yݚ.A2u.ՒW;j R[{iUkox򾐽[|%٩{.W whqͭ{mGo_PQ>/-K)Gzo }[dg{5 `k L:2ݞV.(%y_`+ʏ@6*51IF$%DHqd⧥?iR*,W|֪Cx!+LXwowSl >&*^VD{k>p?p/^WqgP{s˵ο kе$ZRgk9Ϟf5_[}2.n_h_xJږNxu*̼3+BoųCuS/ynTH"'vCND]T "ۮH0/-d55˪~eUƾ5r2Ul#|K I*YլsnV4<1~̛y1Oa3͟(b=Is,z#߽yܾ^R߹~}n^̛×pF$2!{~A. +@я[ ޝbWٻ8rW~9e/ApNKK,9x/H3&%kaQOU*[߽f9p Jy&oVtsano ߹Kx>x6^p*//>g.>30oqi{LJ0QU6(0Fhamruk,{'| ʖ@ Hv%v| L(.?+wϽ1ęIEDqRr~Up]"1`7a8v"J1zØpīX4aTpX&"++07>?i{Ď-# J('z徒5$"԰PN#jR RDtv<6ɴ[DS[ EtL 8')u"S'Obgȹw;8dʙ@i}e/(}^չTbŷh݂aX1!xHU`' V4kѰ ÛYdLxht~4|jQ5Glň9l|b.t\2<~} ӳ)VOoKRƻ/Iz89bEFltJ =(hy"{֓-"0n }QX+Z7aOן9}"qC՜_/K&w%NEӶC{VJBQnxC5Ls< s* NPj;e]gN`IyS#2I1Iޭp8=Z5k>9`!D/PqV]>XZ,I.+.hGZ[ꠉWR"$'Kƴ$xp$;#IWyڹK,$B)8 .nbDH*< &Pu+a ^V0p90Wn…pu*eEsj1QN;xM~O:SAJD9AS6]bnSCBq,Shݗs<$~ԜjQ1p&iu7'o-DMv)C\ ,oiIc`cscIйN/I}TK'ž!8sʗ 1|WĖ'R=OaP]"k=GhJyֻ N}D7eJn|!vR:4%heO_1פI5} c7vɄɛ3L.;(ix", tׁPM7(0V?N8_O_[gaٜ_k؛k@ 0Y!!q+/g0hb$"n[޿/ -`殄j 䉽.)dsYhFv(ӸXcΆOK TPYŕ0тs=LG/ER[k4vg]ZlL)ߘ!=;$V&f}A,pʎ}E;w$߫C4V&J8щN 1$#/W g!XK;G'D9kz@0v210r\"ǿb)Z䪚cŹ?6YRM?]}aux{y>j1Jϰ.(u be"`MƊyZ ,~)t>&Lvj,J:]lRb{ =qiڲ<(x}GeT )*gԔKۤޚ|B̈́O`U'i7GU(M5Sp b0 LJaB!XkfE%!@7~, -vT+G#bW.8X!ݔ7,7yAD' JT͞L%8Wѧآ) =s\ ImAkn \[ ,MFt*X,aUo<>bJgң[PXcPi P -pD" CpE)355EVABF0F#ldOQ,Z%sԷiC|?j3HV(=_@FA=k𕂕,dA_kT3x#bN^GxXz"YHJ#=))bb+=[l'8cS/=$Qz` +;]XKj1%we$󢦪-g-f +5XϹ{8i[mW)!SvKùmD !5vW`TJ<6^&k 07T㦒yé dTi Q.5zIKN,p4&◱Tv偨V5JZ_/f{$yp<)]DS8'_VY`Yqh98A!{yfQfP0AY K1Ct:9Š*R5-~>{pA0f'n4k"c!%vW@'Bʂ@uMUD S>Dl2[5r{y ,jFφZ A6H* )t5xSS2f l$ÎHŢs~ wy!Mk4aրi()G0'Th0H*VRa1Z?V:ZO4KMxColXϨi%bӵ&D>iSخ#!31Q/1~kM36֤|wY>\̓'ga2j938*c<5Wtk C͋&Oɴ rcvj3YSUIOOc>B!F(`rvhHe堻v!|i~Ý _wq+ЖYlN9ƶe]\}ڌ7s{\w׹v>WΖ/obtB<'S%%{{׫w,/8˓/zaUK3"i'߳=1416,(ִ FHOavУ=Z/D̉$C $\sZf ;Lns^}Y.[-/9-wDC= k[֘iQ` WIi:pR B3iC˜f rc ykA0=gckgJaT<&wgN];%D)6]:rӵ~Ü'\5cU:7J@*&F-V MZCXE"+I;bc)ZsL(L$s\1,k f-%@PՀӀdf)% 緯WV8&\Y :GޟK(O cV [8.R e2()#LKg(J'MMs-/[S x!rVJ2JCT"9P/wZ-NV_޶RǬbF 
@݆u,ͦISG1h JīHK[IKr#-1 U$RD'brmTMgJd kQ?Uf4uAӊ - :sDZ(9 5燭"LNB9ew\ʙ$w)&MMJ!J> Vhz[ar @L#y}F;;hNכ]JPR]JP})AQbB:P{ӱ'@LbOZ*ņ=Iό= !Ft =$ 'ga2j%fq{?ymӓkd'_*s8?~9[^L}5I-A+zF~ u"e6dOWϠzfܪPbJލS")2UɊL7aVɊN[\p a&ŭ5b&ft4=h%"8 ֈ7BcN{ޜʀQQ( <IcFuDV4ϨSx nFșQx  N5NucfHeȢ`J@6FJd|P# -I*P\jM$.UE"0jgҚe5\+*BpK ةS vr1pWg>yqaOn.@شfLx~s~׏1:^?v7Z.aA!GFJsZa-8q,ZB[`ŀ8̺S}L\U6(tIQf%FiʹJ#ȣ*R9X8j.*s\q#\RwA YCSqkDݵ+ݓH!tb "ZԶ~ʥdK ,Btub'ՌFH]bTY,N[#rFYwmmJ=;.k/Jٍ+Nyɖ (II#^$ g*Lba_7@8B-X0х.,eK댎'R6b@R>3|ei`H'Uz" g{6wWpDZ$'};!׼ޝ$R LbvfA̔s 9ma6h+!|Om~+A#&5YoHjz=1Jkқ91B͋⻂BHF辜3)98S+e d 悁lOif@\EiHɚ^pb·x;^ӒB+(*eY)JHKEєI.@#2 +!ZzB)NA|:zJr g?,cRIn:FW԰RYF*J}  kDŽk!;:ĚCmlo,99`#ƕ "WT(,IԀfh0k4Ğ:UL= 9X)Ʃ A)Sd(gSR 4R Ox䕲t<~LűcAX't{^}|zN|u ?|yo'qՏogE./ߞ d:[j4tqbg@ț\x4%>?܍|pl>:ED< f6rmk~>74jb<^ӖJ;bU9-l*N9U{"<{OdF :/3;/Ѱ0c CFsr Og 0F%#ld#e)*-_ZQqtqCdInםs{{7Yݟ@;60,&$C0T- GS~PNC[FB4FQJGE6_oٗm,lyq+ NTVxJVP*N&DWKNae iLzꊵ];v:L)u)gj$Rd޻6xDty C嬹ŏpVc]|+_.W/kZ!¥ "cΛ2YFgMUeɥ*T7j78(gh& EӍ,^14k5M")5C#H)PjiHF .-ڻeWꮂW1jd+.sfVĤ)d>!m!%Kjњ] UA L8.K,Rf2I2-'oPBꑬ=ZVi{.-%j`n'(`f11dDN)X40r)X9@L{ qsCMiX?4qNS)Rq=b.k!ZAV1&5U%iZV܃zKۘûQ,P _x(yx]w/'HO?]`(N A+lɠ8+O+Z/zj1PF9TbߗF^/ ^Rt.G3[qRz< I1u-Ə^50wѫF tW?N1yugKch蝯C"?Zz= zTwC} AS<Ⓑm'S!H< Ho0dV| ;2TTG)i;ZIU:R$qI%#]UV [C|J)s񙦜o lA]Yx 8`i9 ZLJ+IDk4].b }QkF;hI'yC^w тj6U PL1Y 0%ϯ4Sa644)\P}&0c=`z#Z #\5e>MpNN2޼ppw8Ym ҪF tI+ K\GQf<߄y0I{s_tD糟18{݁`ggz%]:W8_`> F[{O;3hDTn홡s۾E=$aP6)CM9vf _Hݨ\>|6HӺs|mOdIݘlwtFm9ƌV ДwgU޾wR+&yv1{頜\q28IR0W1z]Lf)y \ܓLUzӖXH:&5nt@Yo)Vf$_LGP7|ڧի<|*;e-YȟDٔu+ݚbc:MQǻ.BɅ[sLֆMMiR_kƹݴһ5 tw;,BKGb\ւMtŦ2bq>9Uzn7:A YukggjΗ>O Y+wHJ[F-d9$LrB[Y_ZRZJe XSKL4fˢ?X$uxOCETs[NFVWizKYPie%x-JE%C1E8 2 "|cʧey4paZM&TWðua=Ȃ 3t]keQݽTWۮzdi]l=Y˕ꁬK=q ڬY讱VVi˵ɯ)y; 90]}M>,WDz$DV[w^պefW.^V%jsZbUZCu(Mi#-d(.Wr LqE]brU )B{p4Sғk#M7E- )rreKRiw ZBRƬe?ߥR2М"-# W#~,\bh̶SiR1MJ>IdzDJeX`zk+(%?N)D {^ZZz)|!KL*s M_f+48e2,ݪ}޷dkJ/qbզeڙhgrZF;yZb䮥9Q%r NFzeIiJ.i6K`@ ʛz@jZdͨ$J頍ڔD}QT`U>H <@%hMil"W=SrrpCU(8g)5"liUTIJQZh[%tS ;Rܚ&ii*uޕ#"e* à{Lczge$݅KUߠRGJLTY r2#=tʻҙ +vVm;NFWWMin*v;iO0Cpm3I \7c5H.i*kְ!Tlj5+w6F<*#TJm'Zf( 1mm5xon&+b\Qv7v 
Efؖ*eڹbVZVK_wau;AC^mCR6S_shdQr"S( t28B9d8|C=Q Ă*(ZM :" jB?a!XyQBY[:`ޭ5qi 0qBHbz> ]E6]n1:+mQ& iuPFW>cxrK+nA9䝃3f@oHoc7h.Mlmv]G Ep2 >tV?LWߛwZ]ӄe֤LCЀj߮URB癢+ )9}UYOOi%:?js٧:; `ixxN*;TХDk NQIFBFu=4d*{Pj\a}Ȕrzyl#1G _}HUg>~)F`Zu.lp6$H6}8.Zwx@sЅ@UTU>gS#0_#>xegCNbW<cM#5ғ&\ ;jYx[8oCؾ!2!`fX>G7@.؄H ξMf5Tn-dop_'mFJ۰n.xΈč+xSe^"ðsºY9̗d/J A A7_T&9@)!x{9ůD\(E5(0sC0 eI]R֩JAh) zx j yI2J T.}- `TIu9BSW2 jP4UwҰ4ravT.g|4I}ZoȨ++Bswa8ئdRh7"YCh]gijHZ'π,]wEc.a k…f@#(8AJ6<@. ӭƈT7%?=nƹ7OQ>J8BlYJO[Yfbű뎥o<y%Σ7UԪ끽LC PR2ݓ-,Z-pÆh.ۭjpǯN)1WbtBT܎V\x\Rt{Ulw\:w0 1f\*<<O:N9@i 7ҥ Q0uN4f(Ձ +&ZƤm^3lf7TS\3̅uvo 04ڼ }Fuk3TO.tN ]_zq~?mhd2)9:- (>o`Xeo5i_V);2Z.S54SLzE9J}`/3\ ( \QEzפ>$2b4RV0*NyS1ud篚 ٠S5oA5 w~dyWcϏ sTSsWtKa0"B@'7沶U )k*bpfR Nx}II⧥?l&$s,.jp7p)> K"#ֹB8 * \!1)NgglWYlE19wN L!- }Gg #Pأ]e(cqPԜd(ᮧ`y#Ie{h?򚎕:hg0/nZ=zߊ)WI$$-6k*;PL{'k"~ Er1GܥGӿ"-_.e681t/k :`F%%B]+3kH=6J7XRetv)ؼ:UzH|? R3t[(`Xm4Դ%ApFY^"9Ȝ4xk3E6=o^vuhJE DLTb>40_p4S+('CSƇ叅 Sϯ;ӎ紣9~.sC!Y ༈H*W(Cl$Yt\PTVI*J˙J罙=bOOO3*rDOD$̞3'?CyqrS~Ѧ @XMx:^2bWJ2Vi !'ӱA.@(-Ա^y#0.gR&1 YQ]Z!4 lM׀i` ;D[A'ᜠ]oBOr:urD`@ O62ʘ/`%d$a %QiW -% --:} -x6F u "D;ã"DjiK'd#BgůpP*zdRH5HG(seR=c8YhO/I=9,[CNdw f:}7y~x*=DfMzh,ZoT*{wBp A^BᎪҰu}+2p=6?w8Kň[~LotSnp:lF`1ݠR3wӐ&^-GSE銷%)W oyA[M0Ij xSSY㦔1]lڊ̴y%Q eԣ+zI݊uomI:BPr4c4*}= u;0 mZ e ;ߏ0;8݋ d.) _r bTPη*ٟ˷r5@:Fup@e(>X#YxڱRfdz_~ZZ<JcȔ>*k(!F<ݟXy'QctCmKPQ}cAppI+Ruԅ^_3Bt̹j;hv:bd3ofSe)mJ ! *F Y×č!x ^ksшQ Z:HY"6'EŌb)Kz↓6 E58@܂>:Cl .KUH0rĨ$g!49tTF0NkXbhW8ЪHB,am`u+PRSɜV F0sˆQ1P)hVY`Q0WvgRjC;7'Ug3TxW]* 6M)\! 
PLh>6߄d/?↲vYŝ/ʻۧ !f}^&oct(=6{Ej + XXOhQ.ղogzYoz{fΧPMSx5-r÷nyEKrVY[n{cQ>,w/G-sѲ?7-{+Zs+Ï-:Z*wsmYAN*֕q/j\_/Ϯxp={)@.϶֘u4l_]{HxoW2Z2gϖxRUm*l (~4JU$7'sEk`qi:wվ-Mm,,+zߵzUȪhu^g` 3bc j)[7i۝r:g\fŽPT6!FZv)m!C I` '1dEe,_N* PrK)*gAQtQ& kޯk~5>R@?Ng3oymX]G\|?cBnepz+n툉2xA>-.a{DKlu q䨘P~Nܹ۹$*m0Ja44f։2HNuu.@I4j.̕q|\A8ES*.$bڹ*7]0Œ nC߿ ;@X"\Yug swn䖮g1ǔaVvq,0}Mt?f{j+=DL{?3XsN?;;btޢ+Bُ Л!=_fttOPsSfGג܇Uq 4g ip ͡@jOaGOY & ӭU*w$|y1uBՀ)P` >!0 8A|&(+"_jYQvCouOmoI 񖴕⶷2t hf4-J3vM `* C9d!FL_NҶ+8w "Đ`}F|` gcYTfY4;]Kvm}ã˧aqppuIFhF"ƏWMXh6qDSBHY&-2eKl@B=Ea#XQ$\`mIu+ Xs62ʦge(cڪOjv6H2mp"cnkyB&lqաn&ZlhqkŜ=X +lD82p6+F%È$q`H0 ]OFd^@G[Ͷw/*7VY$0 / zFyz_$/Q2A"tmbN}}&밻^'.ݽUXem%g[Ɩ o1\{iH& B)2ks\,d\*wt HRS&55d%J__>7 <7 y`= ?jL=ٴ VO'Go?}ZjO(1}'Ɂ}ċ Nk=ysPɁ瓃zԈb(9tptנS`Z:ݟ;9ʈ1I #1 |动<Á9L\)\#f6GFK# sJ__4k 0Y}ʛ\^߼ʳEInм tS2 {_ yJ.='6ESxX-vM۾{u=jnDMč5ꝉ7%aRmH@FO{24jA!+k3c 4[eK+zVBa 8S(c޽v"/ bjiWS}vƒedvh-"]݆C#59Ae> 5#=͐ՅZhn`@ϡ"n7upRp2S $q R@ɦ Df]$RXcN' VHEV%N*)dDiD&!4 b`L|jiVۥۧxOA*wSLtU w$]78jtL'4jrpDK{ \ 8 <m Yu4H@q c09vC(`,HiXN3"_ט㵰ba6q0c.0|1]-ኬ;6#CA6fv8S1 d wQ #,6dC) X.ۂt:Sl.c@ m N}L{[)Y9ˉA?v&lZ-j:D江o|Mw`TNegeM}ko2zꭟ~ z1gEQ0FrΨE L *GJ`7 ͹x~>H,5٪g7j;л/X0_-GsV嘣՝ uMMw IkZm0닆=E?VWk!S$ SPeD+ˋB % S([`h#Y˒eYF%5ύ/ZOԃvZ(_*k/˺Iy Vu?-񧟟w&NI73AT]xW;3ΎN~4bZi6skow?9zt'@@օfj)얐T=#"$|-_ܻ ~:R+抝SFtqȩt]Z)=N0"}Y:!gz@,fcTA,b N>;@9{ʺ9sISQVFN-W_Np)~F+8~❧ݗ?8קټvFȲb#ުŇo? 
l= tipa `Fd(ˁaVxեabN]ȌS@OovrrcaQsF4$ +Hb83(= .4BZR@Bk )OW^6"swUkq*N(D2P fF gE NT {*[[{{T qħ@E 1|onj\ z8|EFK WgKg28_n`H0`ɹ|s[)ei0UϬO7ɣ2Q\O 7NG"F́Pow@_}Q ߳j=UcL/hC cr>'t+WiwMy׽fJQ3(:({ƺZR%QDb :IeZ//4yf=8iQ=sI;Ȉ QcF`M_\` 77PMNj;>absDr6C'ljӁkz>9IOqX<:u%An[Loμ]n]@.8= C} N@ Y pktcOkZBEHF``#Y&':A?>g\?#0Cq8`&Lmx(Em_3mgfUuъXJcXӻ6[W ×6mJ|zziWW&uZwA̭q%Z\W&Iv:h@)0vbЌ☥kO lbИf061Rp"wS%]˟ !aooߡ*eD5CGK\ ެLpeFޠ zkcלsR; 4qBAa(  Y|mJʼn.p:t%A˥DuԼqX2 NJ Ű>NC&;[$ hPnNg >X IGb1¤%72:CEq: @Z%tXP) p$( &;>t 2ed`Hhl0otg(4rbNך ]o656nۦX@.RJVkL^g/hg[ڰ$v;ͦ*7:ǁԦyQ膡RnR9V/|CkDhsnln5XYs5<CLຐYTn8FߣLr׼_גjj-f[ l KWk;vg^ȭrW7V#euvO KGnvuVjr9C>h=9*?_3±B:_|en0y7[1KxdJUȯ6r#"^$>܌`&$ l؉Jl[jKU-GXOXzaG)C;,,-|ywy̶|">6/{׻9H p<(,?8UȵSʥ5WIQ7Zw_|^Fusys:DqPSN *"CꃪǪ7O5["!:cVSTSssy=ocͨnxTrYazN*ty&ޱQ1(;t C\ԧye8M ,^h}*Qh+ WQҚJҹ"YBI:t&=CPdՐ˵FՓԳRP9`%#|SBjJooWh~]/AIv>l\f{0t*Q%3&>]}+|;|0 _H oPM, *^ʮ) p)J:q1+QFSǧkQoaw8VخpaB!fxBKp]) Yݽ)@^*KA͉6w,-|vmW{2Ag5zXRHG3h<EYWv0 ?^ţ[1,VsnTK(,Ϭ)ɞw;' 837LJO|vLR@pinNkPu&%)A6<謋Q1_sØTGJ֧3jF3ID\d9eNQe [Jfi%qUW:̛cl q[O|[RSq;-)n炈װ DVą,#CPG<:w޸ql&]ǝK5ѨՐ6<Ѿ\fS$6aP&>~qwa1m pU 5 Eb,Pr䩮')8bcL+c4 ;ejEA9c\sZ5X 7jſ$nn H0M&nPٮCH[)[,l`^'/eH)Dwqh?_M,M$WA.k濶ȋ퀠a@/z %?E'A.ԅR,urZ+䇼6);Rx`m 俽U P7Q(m O,w6r/ZR/7ٚ`hg:ݛa66nk˯&pq 6]p}5IC>111f$PQẑh #t!i>S(5=6$O鄭@t3 VSȀ;yxT{K*rCi1 Irww&oBɺ댥BC nwsyR*딽|NbՎX{%́RD3[bI6m\)g~FhbԤXQ`nksQS4b=0H[ZC^iLY3y~|} h:1j .glaÃ>u;MGHTk .IogdL$xjDJ ]ܤFSUN,^XRpVQ 2R+IVj#j^ T̥8a<穢`m{\eqb! Ul[c c%Ssג! 
g%Ͼ+<BFvZz]+FeWz0 M4E6 ujƃD&@%Qb% | D0j8qғ5(ՔhMUCBoN>vР]QBi*҅εd ,]hfOE| G0,tsChL.^?)SʨcKdtQB]QV zxRزsLkq:Cj1Q \MʙOMW~ߖO&ɲb]ɕov*#"*Ywj/KxҟW:Uw^ V):Uled¹%"u#ʬ 27SP ϣ, 9gshIa*ԲiӜr&gZrHÔȴ Ni*jMssfG:Ʊ^FymڂȃT`o!/Q}U:BHsG"\IiA"HGWx)%F!BIiZ ,',4᥅ËxԩJ5-8)QpRj⤴Z s>8m)'PxǔRqRVU-qxOBQHi:d4{kP>?{˳tR"3٦ no|il;"+C7rКy"ĹkpJ,=ղZ~aBA[LB3bĺ˒N.JT-uԳ J֔{ D&i[17p+6YPiI#M`چ,ۘ5)Wk6=5P={i hQ!4rtJՔ}'s& ,)S̔g Y$@CEgL\8Cs%QZrJ S,۾$EMPK]G1PY0_\xFB$K\ܥm>jΔ9b#gw{ ,TtKR}aV{՞ù!p 52ڳA*$s 2h _jG) 5FRDCUC N4Q{F֋oOgm~ls]#ȧAY:e]#""H#`ŞG0ftsxtbL.\?)SbϨc=dtRY Q?v^9bSY9"58{!HŞgfrW 42JAŞ2Yq13ώ jldb}HCRw\&EOJBQX'34M0""2v:h{1|zbpԞ^=0NЅWsw%[LW~^E5sg;/NZk,^$'e[edJRĹ"YBIF\(SYEJ]#.KV*ʲ<)}F*R Q&zvm&Gw&m4b~,7//М/Z|&26JwwЌfxuS(u k֌w0tkf.!S 4Cfr>,t-My:\i@UTlE`wa}VҊ^.knfo}~2BsUKMT[fA@fplqKPRv"_B7CApmIkxrڳ*NrRp'IfJd-b^Um8^G(PڞV(돀Vá; ԃKF<ws;^?wʺ(v^bDvsNMO ;c~y7Zp=58s7s!$ |7ܽbΫp*:l Ut l 'v ;ʳy3JUp7x( 1#N9E)56F.[֣{;7Z-Z,0 R8՚ٺNw57,x2ҵMF]Kv{jogQ?ܥ,K<֮luήr?U S"i ;t=}"irr6GM ݶvaz0xqegu8lE@ϝ枟XuK3TsI0:=&smBٽ(zMVL`J5^-Wrӣv2zi>XL 5tFGIj@mzd89b{ m=܉qmCv,p~oV,BwmA(3;Q %̙,4BzEdI+;'OǺ>6g te% S^_6Ybܕmluj7/whQٮ8ş.E{p],K^֎b #f8{:>綘0>b{=_dL8w貁sYXj*n$;[eXԠ\$!߸v)r-ɆvSҹcn]y":]F;eQġ[$Dև|"E{AuAt}Gv]`Fn[h:Eiո~aMLWfHnd@/^u\_,͇1`z,L?_|fWI1ջ'+(_cj8UzIʘxԂrTص㠖,M][Kyx3)z>X$ՌHi˄;niUX chsHm?GȲ&e,) "YvO:u`.w}H7fY^<<g˧R3_]ބH:8$Y qpH&$ HZf-瓨:AeIvHZC+=yPuLBIzL>;udd}늉fyhv. 
SZEkޏ/j>LC,ؽ~6dk1wyO?8/xp݆͚E@K$+cw _v!(ئ$U_^N~E7^ $_AyL&zuo} &pZX^ ]jm0^϶¸ɤb'&0JWm:mWKIy]M՘ xh pLzݻG؍p3?7L'H rľ?p|5"í+KC-JmqzHZE$Gn~~_ՙv2ufPdƄQ`<18 JGhE'i_mS?5''}qAG?p0:P$ gNaH@(Y$@h)+{2yHMTˏ#6v_H"{Dn-iD!rKVbJB2*W!+/H/<&@qSNs$]qSI+Y/ByX_"$##PFP:K!t&[W'߿[טe 1Ǽ.\&]=}V<,@G+|돏#~<~>9iHAy5P}X枟' P]I?gj6_g7"٬O='KJNH΂:??B85 j` tx~vNj<}_=T++~ZmqP;4lu+LA&=q*x tz^2pRĪtC 6 ( 8#] 炂ʱȋ d|J:ܝӎuhx dn){837  7Í3;"2^Ē-tI@2e.3}.+eZ 6&!MI!7D4۫_hCGGP;'!xмڵt*`-h'E MP:vtxǓG5&={4CZo x{'mabS=i:x޷$8i-샃G&j }֖2B:ɍ!&\녑 4RPzچ‘YVd"xAPni%ЉsS4o;t#^V&R,4YMcw]!*uAes˂2.EN%:x4B,|i6ؘxaT[ju캬$n#8[Sj`&[Y-1tI /tTVctoRWcgn/F&6vGd ;9/r$3P"5+ J~&8 d`%va}*نIEr{9IrU"ArIY[ؤ¤8/FHk:{ZAG$71J$նEAA>eI7%J>LE9kz gl\!ޯ>+.^"0B^vB"*2$~7udy;Mj;}&w1D4EztP`[U2Y*âqJΫ+Cvvs;R_C,(`>ēD/ƞ% -A7W%>Ab1`scrg[k%v:̲Lҁ5//]U g[:gٻ8n$p^ #u3#'E+HGk&AHNwbX,~\_=njqQܼ:?O^;ku*S}kE; A`J3F\fxO8BJ':Xv@%W%f6v&N]]?"eKf=cD! (aР sxԀe0p/+=uѹ֍JxѩE rpDbHܧ0lG1&ԌaS2R N:7!X|4ڃ˫a@5ԚJG]*DM BL M,oy_vq[n>hɿnV5]V{uswY/[M1~?ޜlm; ^7v Bj6^t>yC3)oeX37ӿf ɑI& y&bSOqn9xT BL'1m tͻ n9,䅛hMi#=ϘE얊A餾#ƻMRѼ[:M^wa!/Dwl*;!eRO;;|3MjRР -L"LӁMagգՂ+ʆ/}GJASOitK|3 mK%#zH=ZaD )k&n#0T:ٗirxxuDqTCAGӖFWi[bdҨEFwq/o<-R?(G+F b*x"ǾaА0(rX"AP h-"SQV;EBV]tF`' :C3gK-ϞRsݎNf/.MXWr@u|Cȫ5o(H-5f2>vOjgQ钊APT;&^|:!/pM)g=Q [*!6]uh-0л尐n)6[FލΣ/I}Gwr-<л尐n;6mJQ%VRO<cr8:At)%aԅ4B0VVIeUkH(VkUaTUq+#*-1%K;BH/`ub|۝m@<KL[-nSu[lsX5ԝ4+P3`h荋x{wHPhn)?1`.E;6ahS5+9p~,yi+0n=tθV*3{1f&G%LQ9>Y@W\~*~_vwbΘ EDZ{jc"w@~ FSC:r@_Z/qg SR I1G|Ύ}otuhlM ̩#u{XLQ9S-FWyͽ^sSG0שK5db0lgP#k]2~BbLZ\-|$8<zנ~FdԠWiPT:Zf|EG4L$j,5vZykTVq03veIv[M5LI% I*%Ɏm @IP}iۂq )CQ]lXprqx;Gf;Lw$Ur4#?p 859p ֚JW8TP <$Xje4#ʉ=ꛃ:ŴRZsjkG"H'YRJ-®V,a[:Ẅ́CQo<a68],b#>l*A`BsQ5nvajQ:p+VNJTT ɬ ǐAb-*|.O-SyPfXb%cXD[܍-X^1,9)GWlUMwmq:@3).3qUeIj挧,5i( XT`[fbs{Nѿ?ǘ)ւn4˲0%.3\xGN.o>v%AA P9x%m%LKr&]-`6Rƚ&lci*),BBZK4.QEemMUq$du&hi˶FWa6bԒvwx _0R%VЎ@KiݣJ8'%%xX4a >s,f^l! 3*_%<+~Y)WxDfiőU69wN9+\8,oPrU6=G{9"vKnt[y 17h S8yx]cx?bE;ʳB* kW9n߿o6? 
Atf#?Mly!n=v&©+lZoMA9%12U8hj$Hu#zftյ_>"F[ t^ .SwCTZka`]D 15BlcF߃#UzdڰJNnVȽNOn̓ :ʒ  @ -dk̄5ٓSbuZ- iTRLFAci ցY'[b*%V@pfJjZ+K1]s&G/PcI# ;ɖ߼y3tokUw\kq0 +shnx-}0jsWB 0<://WqHnxo}xoa(-$&¾Jb9U2DYPfGrSݫ4D'Cp@%,[ sH.맒p@%xw*iPS$LqPI ۽_OӔNgپ5"oKѕFjO'ٸLji-b&pvY:pĿY|ikrv@3i=( <_of8c''{3ux?E:5GF6oĸ497zkerw¼/3fv%T;ݛo&H 16 ŧp7oV-ǘ+  ʈstLc >ׅBV YZQtp0R4RC[<(2rz BgĘ:[TƁI "?9X |#8A_m0j(ȿQssm!q*˯Fde? \N{V3g B3hhT_x[X;Bq[Dm'G{֔'uAR ǧGh"=r}|H~#n%̳Q E%HvQZ) O3-i&w%L$h\F)6GyL26?΂Uh_|\} Ҝ+4!곎9w; bF4V} 7?lJxNo X~!(y!q(MZ\4URh"bGB MxN-V5/meSwDDkNt4 1V$@&g'H5}8 C3&&Rb.au6Ҕ8L-N ֩XX1B5UdwWpΎ3[Y dW}9i c̊?BX3RQfJ -qĆCOb7g** 2EZ^&~r)Ej.|'1k.4"k/7T/uI}bhữI#I훇O[ ;%/, " D^_soOObwkBק).OzΥY,~l^I& >zW,iB07>} )EX3l8|0lo͖.g6אDڍ?nnu{/Yju3pg JI:} Uu{r1&nlL\CD X;/'vo- y˟ CF!ꐣW M TMzKvG:J1ĽKWBg2g#2МY0ke4.Rr%8Vd@٭PhmG85EIiO˲BNre&$4uLu"tg~ZL;r`Df}>.l'_h>e`Kq&؟ЀfXkjwdƁ234,+,V%T$vlR{[6yR!w̬ԠP<݉р0x`W > j< &0xiqL8O@ztc4ѱ}==})r>u,T+54&K  Kf}y\4 4+>AQ8Z<{)j!:Xש"d I"eў/C~yn;_M/[J 0A_!%FY AR)쨅ucH*CK~ fiy7Zm~ }wH)@.e1k+ƖKEI TXP*i5U:ŜL~DMZRx+J@C+*3YZqd 5_f9圕g+We\M,A,l3 l IIh ЪEIƔa<0jDА .܇̙8eLtdİ [j@tNJJTs;BZ &]- lΨ23zP3,3S; 1׳H-=;ge ڄ֦Ы䎆sX 7$jUGғ7朳h(}z2BNYW6w.ߑ{֤}JxbXzz:gZ(Vs5+XA4ب݈e BT7F@B*WSi)4Y % . 
E;]o7W]WKxZ4hڻ/- K)Np%K+Y)ΦAc7p8SŬ*f/01ḄDSdaQP(xVx-(܌YevZHI'OA; ;FϠv#a.bIp)ɮ{dg} {} 4}rHHؓ;[2O1UfgSjmFdf*/ߛcԹ /NS֙/(6I@ $j72S/϶)-}P@\"j+EaYs_A(#^z91$%*0C!-3X!PG%1nӒI+Q8J/mSqw~q )e4Щ`%3jX{KVRPCܻPgYLElg!=]N>;$ƍ6G9,uY%bMtl& iT\٪0bbM-8o̅.TAF~#ĸ ͜du!ߙi ,܍V^DC<0f\Ò_GvB6d+P*@QWf@}J(owp׬*)y㘻faN kV&Y]:$ J2Q[[U RD;n;Qj^MnuH.'˔&Jp^@Fc82dLqص;< gf+w\ t8x&H2z{s8% +I{ |.v~?My;q;k|gm^Mb:\zh%Oyz|9_{o)n,ts㚝RćO'$q;zv挟89~z/}4#ͳ=/dnκԩŗ9YpRPԁ%Z½qf<]pQ;V$d`+}S]da 1@%b`;;sjt]gf/iCjq멏cAPۋPe.[w`pTLVH>Um&>/U#}nLfT˺Vg*n]tc:`n|>r*إ)]IEk OhSs{]rɦGe.`25L}b+ӥ֠Z)VSJBώaOC@l|4ϖeu1(Ofn>jbWhP,Hp>96~8 oep[\Vױu n6SXGQ{P9F..HJa hzʰK5wWWYQ,T!WUy@N(YQ[gOKjp\FױQuQ@Y>LO=f0r͘>f(z"MNZ@5wFCG'GU/?Yq5#5ke_N'#vrQqM_}^u2r++o4qAܬLJAJYL!f$e)$ tYBqF1Wč=W$MA6J+`X8/m9cpRr7JiVMR2n-`[Aл\ʝ^]x:LRX&4ױ>}^LlQ싈/(hF򻥣wVUAt.aql^<5`# "2Dl ~l8b%7c.8+zU-MF߹]d#h+fb7Hǫ+P3`fT],MxenJyH$mKn JMeWm{7:ͽnu-ۂj9hg2A`V @6@)#'ɶ`U|G>h!\>l҇s<y=lQqf4X jðd, l)EBfWVx4)QL7npm^eSj+ zOb|2Q!pHt |vlS<ڿdV'!xwெN&t"c8x70lAiߌQhA^:#Wd!wke6c斐 ?uN8jm ]o9x3`|sELxuLj 5 5LvKc(PM'"Dn:MorY YX0q]n4Il;@74 OAMr`qΡQB59w^KD9vLxO;3['tHWHTJg$y<7 nع2&XS;3DLUJgU*XI|nA+EbpnaQ6ʰUT`+F~ni9e^r;g%h/YaXWV+@`iZ!m¥d7"ߌûo*%yN ̢mLkrbƙ, E t9`w9C"(\ Ue$ ͛іTĐҕg<,2zMD{Ą8pNX pf`o%\jy@6ƓVHaCGd"g"DYQB1D& VLR%y\&-P>CT8SJr8 /@Cr? +Zн Gǽ|dVA^Zt^ݥEШ*)]FۙtGs5٨ب EtTձIo&/&@Q\&퇷}VIC[ߛ7M]ԘMK v'[{3#=x|tu3WTj$6_&\|OA͘|a.΅Nae>QiæbݛBhOcw[ػJ8E`yx}q4nm<*6^\B[,@֨ Fx ج<, 2ȴ>#0%AO+VJKyƵtRk/fEznE70l986#qEC6JƬԦ:H0aV;? 
-7u.HUwѕ^ ;(^}/B p/DD!NKq+Ztc$5ySd8w.zX?m|aE~~7^,Sfv8SW狛b!_y_ Wq.]|2"= xJ;.t/^ tP[KS.3=>t!'j%U)P/#:1ŋ5'l9XHGsX!wu* jd!bEK㆘+p<כQXdIKs|G4lζV f 3 `#_Gߙ OV=2gnB4G&u$SnQdm >!oUp!)orv{2Ek9kgA!(KIO>3>x ˜υpsuvOY:Nyat+* MX[)F=NCSpFϛx,'HC>@ N~O]t/Q>|^G.g.Q(iڂ; )SAb0cn2չ nu٫R]/?Y48/apH2DU y<>\Sus)b-/U F3jHZ5Jy+)TJJUr,T.N .1CgRJ$L2 YOi$ڮ!(Æ 0 `t<3(JpZS˄93R*cW1lic!=5]2V2qV%ieCƸfg|pIni kZr:}PWC>(K2jrGE,Y V1G 9~ʲ }}ݖQ R!݃*Դ&2X]v&'LJr5R$eůA(YQK*Dexݽ<6Y c gZ/b'SR~xf.U883f%Z ,xp.!XF2KMߡ& VTicכxD0ERyG |sW6 _j8˝K+Vv5F zU2ց\,.IK1+`ᦃ;=z{1nbSnCfoW@֤dU&`s:c,(xj3Qlmcr@$(0;~nzǥʚ2/d/Ez' kzoNi"/^l^(twCDhx͍GOfS"H)*} bDG JIH-ڷ$RBe+R&aW;r\ nvϑ[7c4P=p7 #.M<+eKD[sK\߃n D2~zg7 Qf,h%cDC zBNƂM+NZ b: O.9RIrZ2p/wwW'5.2  +'f,|,d=߻TһD%tL}>=*×'z=ˉ"FH2a}M)>Բ9od 'iRPi0 /h71}Q4}QS{h}ao` Ɣ% ,Ӕ%HW!,;ehlv8$䜧=MR,>ub4Z0aR %B2IcwOjG )*ĹSe:#w]Z>Æ;Ed sj &I[klPD:C}vxi@ArLz!B4_|vyhp._=7_DÚ$Vi,Pbo]U-MqN+9ݶHBzro΀d{]JA%˼cW3;[\3d^cY\iR`,۷qd|ļ`1.5 s¯'?}^Fo^PB"j-ϟ{߶@$%\AG-,.6$=5xsZKb΢O9\R] Ȗ%pssi| F;$?Vmv< R6F^S 1ĜUə"C(g\[h ܙjI; 6' փKfKڽ%U2(@=_eR*ހ %Pdrj00( 9"LN1s$aEc(=L\ppk6OLx<"|Ui줺w{dFQfd6{O50 `@i(ŎXB .8UX܆(D!J<C>U er?nrA [7-A( 9$I!<pn4F1B#& kl3YD$W%E ^ $pq3͙4o@0Pb`lӖfgYw3w&ڄ4ϔWb6wEASf(@vghgx70Z&1Bo~qb>ZД)9p &jA9& cT:|s1βL m)ƙ6@\{o޽=>^j(c^rBf_uol!Q'Z\m8`BvJ4/%aDg!Ȩ4 UuL*itbr˧F2W}fO$Rd8aJ֗uY_Ѭ_Oj C-; y>DNnXPJoeGcYyHpGU)[>g];q=S :˙1uG9]=8t}m?޶RG ޭa{kASXm{S (tnzڰGwxS|FU "O=Пԙ= H(z^=HpI}GV!vثtr!Yx 8r_*=O>cH~Y L0ȑudMPR(/=(Vs{̧wP'vVinu;oZ^eT)eL CzOޡp`~kfu7c﫡d^L* W[T^e5+E\hn'-9\ BMQXqRӜ3J G7o5 16CD¹He)j gR۩aqCCΩok04nQn d5@!uHp({XqZH"ۆlܥ0x9g#ZO蜦hƙyA0z?|Sӌ`iK`0TeRdFy\HG0v701o8S$ q#k<)\?4 O`7kU_yH d c{޽#?S7(m7gwCe!\*`K&?Ɛ 8gl"B z/w4Lc1 Dv1@ANÝ(45 G2^^-SË^ }u͐GA]hD₧8;j9\;p8fFf|{> hfG^?-K ٷXS$C5mItͭoiK Gj(Hl`9Yu8?|u෭IT7e$DV,`-'wJjN5=FABJ)q >$7+lN5ٹ O5NOy0H'bs'͛uŇţYW ^>Laֺ?]j"/ߓ1{z9 WcD߄,?>r5TT"<޲+?kHye׎rI{ɭ*\ߔ!sM)FFn*~-Iu2=ZV֭ hN :zu[*1:F6.J#[zoHֆUN1*oxV͡o I8;KŜj:׆P0{6ⲡI[r8W$DRQ0'5"_y*\LqW`yEudb)VθsSq1o8I"# ɺYRZ.\-T ]J˵[Z"kæ;\:}W|}_, &{4~ܧˤ4O"г5Pp@$)e<4)p(!~އ*ؿ@_9UVj %3ĦW.[otmNV:IĢǦ͌^LJmR[ e3i˨6A:0 7(K}q 
JBG8]:T#/Yvع|,0Ӧ`v^%euoE->f>JMhk*}=W; ee) DG¸,#s{ju~Y3":k@Y /&[tn]"Q7"DuFJ?Ij?$aizݤ]Te 6q#җxQjaRds* yD8[ۘI3)aLrő84n4эn]<(X)9&'bt((5%R"un11>'{/9#r/P<t6pbϽ&bXXNA JڕyWfn>y('eJEw'<\7o("c2tzʏ^; "2D߿>|񏻲,|(ԍD+߹2WjȾh&?~{t}҂ AbYսMKaZRJ:ddU͒H fR5Ϝ":09aB_7^MiϫnĴeΠ)qMVl1-^Wimƻq<̓`I9smU$;k;܋?[Qxz/;\_FLaeLP몧%CMV1TP-tzC]YAA z|U-Gf!ha#~r{@b xݗ/ ޝAVM#;FzLWId__+B>+l?K%n7DtLxsmiKU I;UXj)4ɕ|u!)K!h v<ՓU{'} E*wb楰`VCOШD۫i\n^jigjA134B/u(`#ueˊ:j[V7$91N)圱] 0-GIC|).p=ΔV|,V"C $j+fl1\mltmfwTz86EF/;rSv-F&6ucX5~E␸T}myY($i48>l=0q𡹥@z]tdp v/wG=03L==ϭt/ﴧt&s=9&՘ %azJ O!c !'D"r0{稞RMT')=j)e1.cBJYdx6@G;9!(~@H .\~_*[s[p~:f̚/Oo^}nׁKbabn&Z>ߐE/`;+S$+y$vpо^NC,˕3ma20}Kh6b[B? bHJ xPĉ7[#E9H;;jN? X~%W! 8Y]9BJ)-h],/yc ~ge`e`e`e|:˼(,1mYQNwJ`3gp2SEet0z>׉Sb@磺*iǠEo'2 ncxI7BZN9+&*4CLZ0(l x~j)|na $ X,8pLEͫe6)rOehÌb@Â#k"sٌI3 ˬ([ MؓSnk94kA%*wC:qeX WQ [u]9k1[8) TKDNKi5 %DXN(d{s ^.?a?8j}<'LD4 +]zkxN#x.Y(G:zM كG #]0S(<`a5LkK^"V8sr xÝjtƕ&QԱL"-1.)*zMD&|VdS)mp}.wlphRbېWlfFzae0=,Wb%9G̃.!sA٢ ʷP 1LrLGﴠ837M hYO99TOVւSሃ8)%!$B!IiI^-lxңH]JXRALl[S_p6P,Űg"cL f7nUGX-ٸWf 9ܙued-ݘIg+/>h|,`s>?z 98t F쪧8ϧ>Nz86`WK#G%jCmӂg#O ξ I2׿=MI&]Awr:BK}NŠ#Š9/ԜJ8q RFy<7?!U"VNg2ӨGFDc|7hXQ/uLǹF"qz\.!Huv`߫&O}ifAZXy-{ڨe}\:U`ϖ_ ]UPb~Ո_NW_XFR6kRnah.7^x0W-aJcLj^s ,BO9mhsAt/]7Hs cIԕ +&ވ}H bWq+J~0 JQAEJK 91Lqђ'Yu@}w)R1&`XR`PTRcI+C'!w@W9!& %2Y4EƤh'T~\CEa}C|V?=**mK\K%,oys 2\'dg$3rQܕ}?Q׷g!5@_F˷f?V6P1fspe˯j-8ܭ$&Sc-BIRf5ֽ/0+a_}s "UÛw{)C QЪ#eOiH|JSz౥Cgm>Hk<&$1E)LB4&W4ܷTB䆒0 .=T_9[[Q~u{U?卍ome>OVI?{MmZg֌/|^ yٔfU'Dke@I4">43eN0I!$XEWuJ(S[ 4E0B kNYY1+RQlg>4G:ui?=ʣhuc' M4 ^k\:{<&F"-ͭ3s A2N!Zlʃ ?=ko#7Eܞ4,Hw8|w;"HI[-&[-ɣأ^,y؛m9L"b2 E4m S/cE(3TΘ)fY<he{</ILW؝م%syd.>qT3zQ7KITqTv2"*|Jfc,~r9?wxߴ̧ANz"Nd87bmGOصp8LX{9y7-7S !P8z!KT۶eVѠGt_VbՅ>յ*JV%)9!L%YOբ]31c֐ "6]ig K}L6܏-@(]2"1"1"1fnqk֒ugHJQeoG$<\ d.^E}_YuV{V1Aƞ%`Lcdfq323ߘMBdN}7/>EFv}B3)jebOERg=^dx.NyǙVHLC~?JMa2]ZL dJe[ jåa_p2B"j\[OP0IcKXwӢLhؾ7PQc;bOC}5?8h24# 3$'ElU{gmXH֭Z+яdH?|~c7goΖrOB/8yp1xԒ;b.2`ȼ%C9&cc! q{YI! 
׳^ ޲r2XR ^~ Zy`^)ZyZ>suB{ <@#IBt(A9o4SP|nPPP{ ڡ*)ThĩWwԫӣf0zUQ >jT5^(I3Bl]>9lOB=' 婁& 嬰qP(VrwאJ7kV9fbN\vNEul8dO@{lh81Au1]cVpBNyέzu-X2]bzq +un,ZAhmX*'œ Fg/_║fy'I Z"'84gg}rΜWڥ8u*LUqEr I`5ˁBMOd@jBݞ͚KЏ<:/Vii]z2VʳRVFpm+^HP"N-!58{uؗãnK/)~xu${20zq/o/on{\ˮsחӫׯ^NM] 73(4^*FbUFKީoKvI>&4YxJߜJ>jw[!ᑙ8-b/uGr"3wU [CˏO,y2p㫟 !?`B.r̤"xzt.nc|jk[y~nt.GAeN(L(#0gzV8؟8D,#1n}nG+#6/yF"jȇmEp$O񊚲4)/2d=ܸ;iF4u;<~Y$|c봸 7 _.b;Q(_T Wq%v/lv'E$+>2p׻5⇠ GtDy ,A%Ց)RO~ f4R3c{1kJkTzw+,hduD&XQ:ï`P[\dM O<ژBp1{Ĝ2V3M~s;_y1[q xv}=^fkbcm:i!@]:q.9p?W9hG(ˉ]bb?Rx5̅cM͇K~psHLLHLHLHLh2yJrg=YA+V`T8QʆSI|/])ڷ^O[W%2r܎\/of7 ^ٱc@>GZXRoii з9͚ ZWU3ɁFP5N գh{.8YkvAs_S;c;+e<jnb"dG'EE)]8Lq:Om<lSa͜< j5Eܻh?c!tGQ+됡3PSFbc=+QㅨL ԍ^6 jic,Oh;)K=9)c-ْ<ӋL(BUd\1mj IWL#qC}qh. DBmM31e}^KtYn'35yswiI^:8>~:w}{]ݼK:^D~d7q ^fs\mdlr޳9k`YǮ=vq٧t,69fL1[Rf}<ry `\RԎxs:F:0gjT.Aˁi|PZ l9QēxLhxIό\1 aߗ|x$ 9b׵;>&T"_ OJer0:5ȫ(JUOp-"blWl5)1v%%o}h(@% ҃ bk0Z22`Z]ik"fHqm"q<ڃw_#cf5L1T _9$@{ 0>RHZ.hsNu*PJFRF(J7ەZ $ 3oXyC:g!!F:ZI]Uc\[!3b\ Q݅>d$Z繑vCoY̋eX,ޕ[sO/fFDh~ϿL`̗:?g\M6߳o,*%G~tƵVoO>h0R2mCuszb==B(ͮԻMݧOqFHqm=3cEEًޓwJ3i$ rʔ>Jk34f)I㨓4dWbzQ"_9j>__d盳r<[6|sv]>ܾeo.8ȀG,^3aM=6P DyPtWôr}'2&q5%zΏI.&o&5m{-Ux] kw)~Uo rޕ$"evTyzvy醐YikLK2I$%)R:XjXETdDƑK>?- { ɩ՘0qނxcFЍcG .ߌ`0:EhAѺMO9=-Gvu]J+x_ã:n^*_+zt)&c^tڜuKqH C?sծɳ Gq lzT`TӍFN;A kH jIc}Ћj_]Ccv?;ݙj+g׍EE=vQGnA/Pdvvhz WҞjI3{>1G[F<&TzLW^:eO~d݄i*hbP:}ƺ"[B6UNٮUm#z􏬛žuA}Fv(d8֭KuBC*Sgġgm=?5;g0`:*]\j>c 3&s9-nTnn"GsR%WO[E7mj F[K2.cNۮ }'uYD`L(yα=ϸ+}FlvN!9 Rj#{Q'?7tqV CRJQHijodEK EFu5R+j2-jF<:tj[CYc fuڮAş6N~/JK-wfܢwvSύPK &HFȚ^$}^\ˠ =;-{ f=co8d7D3JMnZ)ԣ"9,;!Gg w^Eu_Tťm+2nk%QM>$Q_6}0FV6 M,oV<$...|ԫ/wk'r(8\3riQ܂`]~ߑ`SHprUuܓV3BŠҚYG恣@lpA맫JA2DI٢0B=h Usc9( 3 v\h%\I2 DEAm f!6UDO&.ZcA^D+p"FG}+UwuT5k4Q~#B;ԢZzaxḺ&o6Tne6 p#X߿n6m-,>넟ԃ6X庯 oVp%G`SͿLZ4- '(oG4965-Eh|`%A>^U&E,Of[%y4(ȱ՞`{lVtfcӫq]MHʌTU&n2M6d t?9F} ph[woƸ%:YZEm 8ED~gOJcMʻIJjm 'denD#WB#7,![J@k( h"X\ BG!ϒ(EvaR9SbŇ$MK3LQm(qKR/:Sbiu$wDZDޤ*9Sb%QZǷ49W-ג%;oJ6XP@mU*"Y+!)(z%w?{}{/B¾L{ ܘA!0qJpO![DZBVpZ!F+8]CN/X\|z]|QfE_DH /eiS| 0ج5z+aH+AW,Y1#-̡½/$V_ Q|}BD9Z=#&R3b<`M`kq0<3b;/la=_7@Ѣ\Itkvv2P`+ 
+=F43uȏv_,Eno| /GX-_vyEnE(}<|zguI-#xv}]|pAfxo):?_ږԗ~zn6XƑWFC^Ns@:p]Ȏ76]dny#4QlgwyMvX5dލ!,\f[v8H*|KUMVxU(־)|{oMOD-v:2uz/s_ ;K݇b6$:q@Z1uZEeR6'Gؿuſ'wFpLd/g^N9._wRC ;7jηd e&tD-?Km!4~soүoAt5J=֤JhW%G|F?߉""")5CG]IWEk$ T()?2?_Q gp(Y7>(d9n]ֽ^ڬ}N/giIU3Ty^g&O&NV_N@3%}#/z'xsĴl΍,QN4Zp u&y)+hM!9bYvܓ'ag.]fuùi$1Ze? qHPCւIug4bۢZgm>C+p}UkQ[ dXj :~mYr ?k-35Z6ӯr<1Y|ܔ3,aޯ׳/-sp(qX9[?[w_i石7_}JoiIg.~=v+rֈ'}K|[Vpcps+!ppt`kD=A5X1J v97 v^Z$)?nS횣޴Ej)R3{HN?J?gLLNOl׸ ͉_ @NXW1BQFы;Xfqz8;zXwV7qԩڼEoz/OU;Fs*̥u1p**W,Aak&;rΗ|(- #~ۥɍ"cs3/rחZ823W׋t93Vq503e寖e ƦqCBr05 Wjg;oޢ/N mfGd -=_d{Xe:ԁ}6!CȬቸOK\-'m/ ; ȷ*acu25Z&HfP9@2x5@1Vyu݌BF݄ ?:]Ft fb`<\_/ z,]*NaW,.=1EUP/y &AP|VZ"y ٔcr9PVuD)#Gjћ;Rfv,q ).,VRjFfZJwl(Jž̠(g3f3t.cb lp@/F#u%E7싄H\^Q`tZZ=9./!^Z>ћ䍜?{Dx|BJVcz{.bzdZ``V / zG%Oj<)NFh+ЊpԆV:^=\_`nNATP)0**0^ĕC60 =9AE_Qo|/wpB "3\Zn"QH]M K ,#9gNH: syYw\4D"&0bXYkvȡ r7iQWZH HׅEgH7$^/N4zHEƩFRϧ3㾛XJК,m' Dm,z%6[#$cRʱN`q0c,^2 fҠ6Y ȕiבu@(z!˓ Gq(hM񆗈ȽViBx쒞/jAY~VV2XL &icR)FϢJO8]BW>/I\$s=M^.ihg =&X?b9Z_hyeX^,<iخU{j4n)r=؍x͜O@Zh.[~Z,<]N Ec$.)O-@b@{! GCC'ړ ޴1ml?*RLd E*w5ҝ2O8vڋ*,0!9 m%٧Ōun РPXpPlGW<8$Oťq!" 
)Jdj e+Gm%Ew# By ai|Qɡ0Kn_m!4&!*)bFe;AѠ'q抟payVbVger}@fMB1g*dN <9ZbM>]}ne u3][|NY]ҹjl=餐j8^it^.,mq%d4%!IXK혔U@\5>넽V܎D!N8pt`1]-?NWqBjtsT=z|qoŪtirԥQō݌y-7IA|||;J?6xw@GjK@ @ E9>:gIOhy E%Wm/[KҼ=ʕ8F ^$W{m2\;dY|l!H06.SAڛL*Bl!7md#6IJ"iyB镸; IFv6:p&Z&(8?F Ņ\%pvRjFz If,TL$%K2ٳ[)$l@-3,St苀d%J^{.x%YgG.>2iqn~gӴ ~'<s+TrH;pPa4C ;Q{0K&CDaƾbg0 SYJ;tuvq6z^\-A (ȷЪ2)BNY98|_p'R;uܴ_ĂWvwYϣ-d8Pn`4Z޲rnf }!0z;d[zQP#"U(\D>M8c_ ~wG|lc$ âIAR=C MRAcmI]o9ߣZؙ4|So rX3 (J).FKs ŏ؝ 0_|bɪ\F@6?FRwV*d$C;|gzov]q'ZM@0fPRfdC1mzL4zPD wd*tY^8~N)2ϣi1dmuAFֶV^ڔڮwdvؼpԱv+n\2Md:#~(9&#J'i=сN_jת=^5-cgշXi{\k=vrה]~xd^)4s'JCfy1%.WFrqñ?WῶpN\9\[k r7})"lpEB7?V Jnt,gQ[@^u,e&[gFqZg7ūjx[9@/WG~]ݺֻ>m^ix_nt^vGK?OaqVMƣG!ӯtrs貵9Gh1Ͽ= (TS.YH ze_|߮ɯ9v$K[-{9~giIr.\⠅,j?ٻzIQY 9LchㄇzoxUh5siG]gd7Kq\Nw9^VF3w駻lWA?_n [67burɖ2Ӌ\5̛utQhTȨbK:6Ia͜RQgk_z5{[ R(V l$Nt'#2#Dl:av>M0|hC*@F%rmc2Rv҅\eO:AZVmc\Ocۿ=iKs-CkAGh<MI,Y1^P`|Opx|ܼ }mfhd0YBk\,QX70/%gRH5zDBwu|c96ikQj!C?l_!bIST2xTz-3 7'~mo\2nrN:Bpsm^1 ;OaKt(AA&'q` cES " #r(>8ןul >2;\O^jzX"dI:KrN}ȱ+r;K5R|^CZT& BB8' K.NGbhQ|onsHqۼ |˻oa8!O=SNݾn@^G#A:=˺^&1O6%%&b0VQzLƤ:X1%9P @Z-% zt;7-q-kQND..$IX:{.3CSՋOʲ%"t3m  . 
;/ÀlIk<8:t:BmPbo*_ % ]<HdDR蘈، .fbq>NjU~ҹ1mGI:+AkL HO$⽋Xȸ]1Fwwjvp@|$+**#W5H@.AWۙ=u|27t'[ٹT{(%%\X%K@K?{ϼdF5;#-H+Cb8 CUm* "3I#s{X4F%즖~F>͓>`!C`O٣&}"ɮxdHjN:NNҝnW#k @C\aN e$C:qJ؝uS/  fk LN&]k&Π` f!YQjQ?aw:>;l~qpPhQG_+V FvY4I"=7pbð*۬ۖ>.R޺Qy4vǁA9qnHA&P2!D!7ڵD+:%Tf{VFUflX9t#.Uw5&ɾOLR4#c-:e+q]3sUm.e}:Bs7!䓳Wvę[jtAc>,VA\c ގ'gG;Rº'jS>\$a7 1ߤ&|~nZdIAs[v(Ah:7ؼuh.։6M@p_A&]LVh\,&e16A!CeJEe U_I9_(MkVXLde<:\L,"7^eh2k =`w!?1w9ܓ O/x;]b\s2Qԋ WHeDÀ4 sVÈڠQ\c֜ffJ]^<=qӈ6MCô̠5ȝFC3b/B - 0"~jX΁?ܼ98|'7h&urj$jn%& K=@O%Dt)('4؝;`n.)Kg(wgorN{q>ڊ+SRO~#[dvD7Uܛ.L+N+XPuMuCǭ;Φ ]6j=$Hlk6)!>qRnr7<ן{@=8cܮVpl X]jRjWD$w /߃"kq8D&z)4W2V翠"~DɦM*N[a Pu8NmcՎ$*AL[+A'C * ^kmv'K[o!Cй^ċV[-c]&&Ny/!_.\h٠fG`:59|X-O6r^:yT)?:UZőK7?\'D{ŤYqnexR/{ߓ\jN?_aLdzEΏeߩdn;yg mK2Cz~2nAj\!wYMjG^ :h ز*D)Q"bov?SEлlQk`ѼGH'%h9Pj\t)f7`>x՛aUM9 لL%,\|FYeۛ0@O5q2 z-hR*Vmflp ͏]Ιk͙gһ587wZs T2^pB Fm`XZ$)`%>Z͵P`|;bTs=bSTM̚\_[7Zr|~01$T{2q<ϴ@h)V@  [ C_(k?Tp` dd(FF&\,B`r|uk(|P֑8aڡ ɑ<0X$!:2DPc!j1 h#xa Eλ^;vF:1pzр$ 0pE0G2 IJ߿̶ kA-׃0.s8 %I-uSՋ\;l 0- +BrA ^+ri>yw>:xzuэAp1 ٙHmN&@< }/zvb#I-ݮj< ~t;3\y(~V(~W4^ ʍd?pz1Z~C~%l|:?8#≹pKxvhtwx3V@l7|N܂rZ:9`\'m V!x}n^IO=9lO9soK޺L)FQz$ 9BaYxq-Έ|= &2:\};kGa4PRYmXb: TK9:IrQv=}j^.x۫JC"G(y 8W6uJM Iʤz!*4xlPS%gl@ h^(z*o~ڢ٪1thS< ?SB*Dؖ.l 29 PhA ȷNU&/mଷ7ކzLpf^Y;꿇$Gڶ*ac^eNZԊޔSc[N5ᖤpE8xzp.v*Yۏrcl.3Yynp.0d}urYֲhLfPlKo94v5p۹ $?pAHޣݮiJB[ZsّyUBۑV>`7!jB.;R-Wl {E~j_mCZ/[{/vXosa7y}rಽr_1..,/jޚj&hR72LJt7C)zon!? F8/=?'aVc"בpm%SR=PI!AAF;+TL'Zݺ.d[ҠDt~vۥxL`Zݺ.+2UX:kM:./:?V_ %DˆKD`ے= a9\O:Nqᬮ$VX &H 6PǜсPm$D* JtԴG o!NBPg}ASo&R?_]}oozCZڑaȣּq2U F1nzZ;j-:/HnpNx1TQHj,&ɜ4Nf4Kd#El{ {Ov.F⑏Tq.H1Zt^ĝ h"$#D:I@'=%>PB2r}nh/$e,Y[@-sH| ;>d"}el2OWt*aCRHNEKgFd2i^'G'DvwK3c0/^a\j.1{,C-I=ZyŘ0 Cn.4AqS@o=>0JuKufVY88.Е[a=oӱFX}2UTVR `7@Td|IEM_6.hLc^2,*9(1cwU忚;K'K`{AQ8yH^$Ѻ{Akwpvi|h@J/ 2Y1 fHx)&Pj*aœu D3Eti};$ z[暀xґI!8>θP7@ qܿ@-+Nj`tD ǯ8))U,HW2j,y_qLI^q–Ce$%. _wMf}7?E{ .7y~ _HeQ*ViJLYܼZ#Wτy]Օtc ۊJ+1V6dr IKy= ? KB_/c%Hag 31#j49paec#Jj?P[I[rV~fq6 G-\,l]z"TO'\Ԟ @s 꺜 5Sπ]6`ORU>0n鼒?wglrZc=rS/)5DQ!m!XQӼ um,F޶pID6lI{LV7x-E 7n/{O`t[3>"L\kw8. 
;\Vqlk3TW77z%%o`8R3 toÙ(@]azX0-k"2 V)U$N88P蠝܀0Vy_0@f95(R߻(DZvo/;Ap|W((Gx~1~-K4N7^|;j^|F$g\_诈CHf>&աW*3hošpRV{j{]Z<=8׃U>ulz97X>BFs -#Ý"8Tnk &kƘkXaԈǙ=Jm`ݫ)3JĿ[P!N@SaǠ-p*s]GTLjScԁǖmn45߷Z*ANCMMǜlGb$ "IL09c\L m"p)NRՂ)YHbY ]#`DOoz4QH.4&]J A(1HOC-9hgOMZ@-EZ/X'G4˛QABx|xPUm2BغvW.; I?g}us͓[ƛ_ c'aVc墣# y"J,h7f.[[No4n$ 8vkhvBB^ZZt[[No4n"F1vkhvBB^n-SגjK/,Noςs}ut9pB\]'-%q =y^%bą{b3kK Ib= NVHwlrYhTi1Y$94ऑ K?dž[gML4<:`dɒ $HPh\sE+!F sPIE'ģ7h9-P0TJ$F˘E.:`ȉvxʸړHN/:a+TPY12BQB~tpH+!ǡi7cMh$g@:pBt/RT )ūX[T JӨnsj5RZxC6F"r}I-WR˻wOӞ^ĉ96׳M'U2s(*Bܯ}z=r83g?hsoF7\?jA}Vm?z0;2;kG(aI%1iZd`ְD)ns ق5dW'[ ]R[ ktq ]2;kSvSIe@% 19Q r+wL#GS,Jc+c:كWb12"dD|mhI${ϠGu&b #BC-^ @ f%šf6l3" \Bo%d9hE6!,ThbSLD5EzJ_'vɗrJZԖvJĹ-S=<0!IIODv3N4!:3U2b㑝/1 rLR!VV>J7}9\="Y$\@| ˷;M6UV5#{=#%kx^,V^7[Z(|bl%o9MBlI##E&!Ę B!@\ۭтFt$ Ͻb{q+Z(cjl1h*漽 e +0}ὯLh7{ r MеPqe7ZMU|̾tG; 3ئ=0Wq{@0= ˜Fi|HhG:v+K{ePFzL, V̳(ɏty|U2vт ^ Wߠ?_l;Uۗyg)GW+yӣq&LY0?ǧ.d'EKqzBnJ_/`P4/O3ق%?=@7jm@r){Of ZX2pnis&L`,ӅIl4YPwwVBlzҞDzm=9Vi"ʞ3EȻqQk,9}k.=G-ЕZ&j]_WtHRq{ ]xUn;92B}/M{2[ Kn9r%( *3QTe.mm՚Z 2ihR;i\.Y P_Bm8'ݖ2TL2QEk{sb cGk@>߸;ɹ&7ͻ;yO/٠)YgOd7qJgrT'#fo+puM[I[ǥ2gctS[-6K7v}3\q +qoˊ_=ed"p`jG1`LC3491. 
192.168.126.11:17697: read: connection reset by peer" start-of-body=
Mar 08 19:31:54 crc kubenswrapper[4885]: I0308 19:31:54.943848 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38860->192.168.126.11:17697: read: connection reset by peer"
Mar 08 19:31:54 crc kubenswrapper[4885]: E0308 19:31:54.943871 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:54Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.274759 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:55Z is after 2026-02-23T05:33:13Z
Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.491816 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.494494 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dc7c276f635b4934fefdf47d178e900fdab19e641bc2a2268a3866783121710c" exitCode=255
Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.494546 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dc7c276f635b4934fefdf47d178e900fdab19e641bc2a2268a3866783121710c"}
Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.494718 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.495706 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.495772 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.495796 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.496786 4885 scope.go:117] "RemoveContainer" containerID="dc7c276f635b4934fefdf47d178e900fdab19e641bc2a2268a3866783121710c"
Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.275430 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:56Z is after 2026-02-23T05:33:13Z
Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.500314 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.501189 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.505081 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8" exitCode=255
Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.505163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8"}
Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.505323 4885 scope.go:117] "RemoveContainer" containerID="dc7c276f635b4934fefdf47d178e900fdab19e641bc2a2268a3866783121710c"
Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.505491 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.507053 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.507112 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.507124 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.507742 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8"
Mar 08 19:31:56 crc kubenswrapper[4885]: E0308 19:31:56.507997 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 08 19:31:57 crc kubenswrapper[4885]: I0308 19:31:57.276871 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:57Z is after 2026-02-23T05:33:13Z
Mar 08 19:31:57 crc kubenswrapper[4885]: I0308 19:31:57.511475 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 08 19:31:58 crc kubenswrapper[4885]: W0308 19:31:58.005556 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:58Z is after 2026-02-23T05:33:13Z
Mar 08 19:31:58 crc kubenswrapper[4885]: E0308 19:31:58.005671 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.275614 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:58Z is after 2026-02-23T05:33:13Z
Mar 08 19:31:58 crc kubenswrapper[4885]: W0308 19:31:58.895810 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:58Z is after 2026-02-23T05:33:13Z
Mar 08 19:31:58 crc kubenswrapper[4885]: E0308 19:31:58.895961 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.938463 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.938665 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.940268 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.940358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.940384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.941571 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8"
Mar 08 19:31:58 crc kubenswrapper[4885]: E0308 19:31:58.941996 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.946156 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.276755 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:59Z is after 2026-02-23T05:33:13Z
Mar 08 19:31:59 crc kubenswrapper[4885]: W0308 19:31:59.418162 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:59Z is after 2026-02-23T05:33:13Z
Mar 08 19:31:59 crc kubenswrapper[4885]: E0308 19:31:59.418299 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 08 19:31:59 crc kubenswrapper[4885]: E0308 19:31:59.483634 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.520192 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.521527 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.521584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.521604 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.522483 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8"
Mar 08 19:31:59 crc kubenswrapper[4885]: E0308 19:31:59.522775 4885 pod_workers.go:1301] "Error syncing pod,
skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:00 crc kubenswrapper[4885]: I0308 19:32:00.280823 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:32:00Z is after 2026-02-23T05:33:13Z Mar 08 19:32:01 crc kubenswrapper[4885]: I0308 19:32:01.275947 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:32:01Z is after 2026-02-23T05:33:13Z Mar 08 19:32:01 crc kubenswrapper[4885]: E0308 19:32:01.335017 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:32:01Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 08 19:32:01 crc kubenswrapper[4885]: I0308 19:32:01.344816 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:01 crc kubenswrapper[4885]: I0308 19:32:01.346523 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:01 crc kubenswrapper[4885]: I0308 19:32:01.346572 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:01 crc kubenswrapper[4885]: I0308 19:32:01.346589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:01 crc kubenswrapper[4885]: I0308 19:32:01.346622 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:32:01 crc kubenswrapper[4885]: E0308 19:32:01.351449 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:32:01Z is after 2026-02-23T05:33:13Z" node="crc" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.049434 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.049774 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.052783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.052916 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.052987 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.069437 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.276442 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:32:02Z is after 2026-02-23T05:33:13Z Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.530052 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.531603 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.531674 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.531694 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.095483 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.113805 4885 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.280135 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.434958 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= 
Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.435106 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.677077 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.678366 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.680132 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.680220 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.680249 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.681155 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8" Mar 08 19:32:03 crc kubenswrapper[4885]: E0308 19:32:03.681459 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:04 crc 
kubenswrapper[4885]: W0308 19:32:04.181904 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.182010 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 19:32:04 crc kubenswrapper[4885]: I0308 19:32:04.278340 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.909572 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f37573b5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.269675871 +0000 UTC m=+0.665729934,LastTimestamp:2026-03-08 19:31:39.269675871 +0000 UTC m=+0.665729934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.914078 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.916093 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.921095 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.927648 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f43b81836 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.477350454 +0000 UTC m=+0.873404517,LastTimestamp:2026-03-08 19:31:39.477350454 +0000 UTC m=+0.873404517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.933256 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.578724893 +0000 UTC m=+0.974778976,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.939875 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.578770334 +0000 UTC m=+0.974824397,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.946635 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 
19:31:39.578788615 +0000 UTC m=+0.974842678,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.953734 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.671899753 +0000 UTC m=+1.067953816,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.961218 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.671973225 +0000 UTC m=+1.068027288,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.968012 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.671993255 +0000 UTC m=+1.068047318,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.975040 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.673980601 +0000 UTC m=+1.070034654,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.981864 4885 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.674031562 +0000 UTC m=+1.070085625,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.988629 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.674050643 +0000 UTC m=+1.070104696,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.995507 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.675140353 +0000 UTC m=+1.071194386,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.002505 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.675175234 +0000 UTC m=+1.071229267,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.009454 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.675194715 +0000 UTC m=+1.071248748,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.016335 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.677409206 +0000 UTC m=+1.073463279,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.022688 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.677441967 +0000 UTC m=+1.073496030,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.029705 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.677461207 +0000 UTC m=+1.073515270,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.037363 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC 
m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.677600891 +0000 UTC m=+1.073654924,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.043721 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.677625442 +0000 UTC m=+1.073679475,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.050403 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.677768416 +0000 UTC m=+1.073822469,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.057109 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.679311009 +0000 UTC m=+1.075365062,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.063623 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.67934131 +0000 UTC m=+1.075395363,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.072581 4885 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48f691bcea5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.104642213 +0000 UTC m=+1.500696276,LastTimestamp:2026-03-08 19:31:40.104642213 +0000 UTC m=+1.500696276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.079152 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48f691c98c3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.104693955 +0000 UTC 
m=+1.500748018,LastTimestamp:2026-03-08 19:31:40.104693955 +0000 UTC m=+1.500748018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.085337 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48f6955607a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.108415098 +0000 UTC m=+1.504469161,LastTimestamp:2026-03-08 19:31:40.108415098 +0000 UTC m=+1.504469161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.092326 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48f6a602f83 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.125900675 +0000 UTC m=+1.521954728,LastTimestamp:2026-03-08 19:31:40.125900675 +0000 UTC m=+1.521954728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.098796 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48f6a61b716 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.126000918 +0000 UTC m=+1.522054971,LastTimestamp:2026-03-08 19:31:40.126000918 +0000 UTC m=+1.522054971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 
19:32:05.106442 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48f9470a20b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.831621643 +0000 UTC m=+2.227675706,LastTimestamp:2026-03-08 19:31:40.831621643 +0000 UTC m=+2.227675706,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.113209 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48f9485d0ce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.83300987 +0000 UTC m=+2.229063933,LastTimestamp:2026-03-08 19:31:40.83300987 +0000 UTC m=+2.229063933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.119762 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48f94a3efd3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.834983891 +0000 UTC m=+2.231037954,LastTimestamp:2026-03-08 19:31:40.834983891 +0000 UTC m=+2.231037954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.126006 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48f94e3ce55 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.839169621 +0000 UTC m=+2.235223694,LastTimestamp:2026-03-08 19:31:40.839169621 +0000 UTC m=+2.235223694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.132116 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48f94e604df openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.839314655 +0000 UTC m=+2.235368718,LastTimestamp:2026-03-08 19:31:40.839314655 +0000 UTC m=+2.235368718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.138213 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48f9581ec46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.849531974 +0000 UTC 
m=+2.245586037,LastTimestamp:2026-03-08 19:31:40.849531974 +0000 UTC m=+2.245586037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.143597 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48f959cc289 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.851290761 +0000 UTC m=+2.247344804,LastTimestamp:2026-03-08 19:31:40.851290761 +0000 UTC m=+2.247344804,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.149716 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48f959f1dad openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.851445165 +0000 UTC 
m=+2.247499218,LastTimestamp:2026-03-08 19:31:40.851445165 +0000 UTC m=+2.247499218,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.156058 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48f95a0b27e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.851548798 +0000 UTC m=+2.247602861,LastTimestamp:2026-03-08 19:31:40.851548798 +0000 UTC m=+2.247602861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.160777 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48f95e71e77 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.856163959 +0000 UTC m=+2.252218022,LastTimestamp:2026-03-08 19:31:40.856163959 +0000 UTC m=+2.252218022,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.166636 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48f9650ea1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.863097372 +0000 UTC m=+2.259151425,LastTimestamp:2026-03-08 19:31:40.863097372 +0000 UTC m=+2.259151425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.173040 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48faa1afe87 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.195107975 +0000 UTC m=+2.591162028,LastTimestamp:2026-03-08 19:31:41.195107975 +0000 UTC m=+2.591162028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.179225 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fab11e3f1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.211288561 +0000 UTC m=+2.607342614,LastTimestamp:2026-03-08 19:31:41.211288561 +0000 UTC m=+2.607342614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.186200 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fab284366 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.21275479 +0000 UTC m=+2.608808853,LastTimestamp:2026-03-08 19:31:41.21275479 +0000 UTC m=+2.608808853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.193495 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fb5c34fba openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.390688186 +0000 UTC m=+2.786742249,LastTimestamp:2026-03-08 
19:31:41.390688186 +0000 UTC m=+2.786742249,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.200561 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fb623abb8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.397003192 +0000 UTC m=+2.793057225,LastTimestamp:2026-03-08 19:31:41.397003192 +0000 UTC m=+2.793057225,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.207636 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48fb648c834 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.399435316 +0000 UTC m=+2.795489359,LastTimestamp:2026-03-08 19:31:41.399435316 +0000 UTC m=+2.795489359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.217743 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48fb696ddc1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.404552641 +0000 UTC m=+2.800606674,LastTimestamp:2026-03-08 19:31:41.404552641 +0000 UTC m=+2.800606674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.223985 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fbaa4e274 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.472580212 +0000 UTC m=+2.868634245,LastTimestamp:2026-03-08 19:31:41.472580212 +0000 UTC m=+2.868634245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.227445 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fbbc6c119 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.491577113 +0000 UTC m=+2.887631146,LastTimestamp:2026-03-08 19:31:41.491577113 +0000 UTC m=+2.887631146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.233736 4885 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fbbdabaab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.492886187 +0000 UTC m=+2.888940220,LastTimestamp:2026-03-08 19:31:41.492886187 +0000 UTC m=+2.888940220,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.240312 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48fc6e9c7cb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.678421963 +0000 UTC m=+3.074476026,LastTimestamp:2026-03-08 19:31:41.678421963 +0000 UTC 
m=+3.074476026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.246730 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fc6f929d4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.6794301 +0000 UTC m=+3.075484123,LastTimestamp:2026-03-08 19:31:41.6794301 +0000 UTC m=+3.075484123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.253048 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48fc71435b8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 
19:31:41.681202616 +0000 UTC m=+3.077256649,LastTimestamp:2026-03-08 19:31:41.681202616 +0000 UTC m=+3.077256649,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.259104 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fc71b755b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.681677659 +0000 UTC m=+3.077731682,LastTimestamp:2026-03-08 19:31:41.681677659 +0000 UTC m=+3.077731682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.266004 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fc81211ae openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container 
kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.697839534 +0000 UTC m=+3.093893557,LastTimestamp:2026-03-08 19:31:41.697839534 +0000 UTC m=+3.093893557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.270576 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fc8316ff9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.699895289 +0000 UTC m=+3.095949312,LastTimestamp:2026-03-08 19:31:41.699895289 +0000 UTC m=+3.095949312,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.275220 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.275328 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48fc860b756 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.70299375 +0000 UTC m=+3.099047803,LastTimestamp:2026-03-08 19:31:41.70299375 +0000 UTC m=+3.099047803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.277764 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48fc88b2120 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.705773344 +0000 UTC m=+3.101827387,LastTimestamp:2026-03-08 19:31:41.705773344 +0000 UTC m=+3.101827387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.282887 4885 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fc88f8401 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.706060801 +0000 UTC m=+3.102114864,LastTimestamp:2026-03-08 19:31:41.706060801 +0000 UTC m=+3.102114864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.289278 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fc8e022c0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.71134432 +0000 UTC m=+3.107398343,LastTimestamp:2026-03-08 19:31:41.71134432 +0000 UTC m=+3.107398343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.293313 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fcdfd9bd4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.79716194 +0000 UTC m=+3.193215963,LastTimestamp:2026-03-08 19:31:41.79716194 +0000 UTC m=+3.193215963,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.299219 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fceb46a0e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container 
kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.809142286 +0000 UTC m=+3.205196329,LastTimestamp:2026-03-08 19:31:41.809142286 +0000 UTC m=+3.205196329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.305078 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fd38e9152 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.89054805 +0000 UTC m=+3.286602063,LastTimestamp:2026-03-08 19:31:41.89054805 +0000 UTC m=+3.286602063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.311857 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fd3bfd98c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.893777804 +0000 UTC m=+3.289831847,LastTimestamp:2026-03-08 19:31:41.893777804 +0000 UTC m=+3.289831847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.318128 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fd4529776 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.903394678 +0000 UTC m=+3.299448711,LastTimestamp:2026-03-08 19:31:41.903394678 +0000 UTC m=+3.299448711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.322071 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fd4646cf9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.904563449 +0000 UTC m=+3.300617472,LastTimestamp:2026-03-08 19:31:41.904563449 +0000 UTC m=+3.300617472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.327677 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fd46f26a1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.905266337 +0000 UTC m=+3.301320360,LastTimestamp:2026-03-08 19:31:41.905266337 +0000 UTC m=+3.301320360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 
19:32:05.331204 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fd47fe418 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.906363416 +0000 UTC m=+3.302417439,LastTimestamp:2026-03-08 19:31:41.906363416 +0000 UTC m=+3.302417439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.334402 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fe2621673 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.139291251 +0000 UTC 
m=+3.535345274,LastTimestamp:2026-03-08 19:31:42.139291251 +0000 UTC m=+3.535345274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.340209 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fe2935d63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.142520675 +0000 UTC m=+3.538574708,LastTimestamp:2026-03-08 19:31:42.142520675 +0000 UTC m=+3.538574708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.346051 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fe38ecc98 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container 
kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.15899868 +0000 UTC m=+3.555052703,LastTimestamp:2026-03-08 19:31:42.15899868 +0000 UTC m=+3.555052703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.351845 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fe39a262d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.159742509 +0000 UTC m=+3.555796532,LastTimestamp:2026-03-08 19:31:42.159742509 +0000 UTC m=+3.555796532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.357878 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fe39d3142 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.159941954 +0000 UTC m=+3.555995977,LastTimestamp:2026-03-08 19:31:42.159941954 +0000 UTC m=+3.555995977,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.363888 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fee40ee05 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.338444805 +0000 UTC m=+3.734498848,LastTimestamp:2026-03-08 19:31:42.338444805 +0000 UTC m=+3.734498848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.369875 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189af48feef82f49 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.350454601 +0000 UTC m=+3.746508634,LastTimestamp:2026-03-08 19:31:42.350454601 +0000 UTC m=+3.746508634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.376436 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fef09a959 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.351599961 +0000 UTC m=+3.747653994,LastTimestamp:2026-03-08 19:31:42.351599961 +0000 UTC m=+3.747653994,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 
19:32:05.384410 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48ff3a00668 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.428563048 +0000 UTC m=+3.824617111,LastTimestamp:2026-03-08 19:31:42.428563048 +0000 UTC m=+3.824617111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.392021 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48ffcc977d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.582274006 +0000 UTC m=+3.978328039,LastTimestamp:2026-03-08 19:31:42.582274006 +0000 UTC 
m=+3.978328039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.399085 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48ffe02c97d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.602807677 +0000 UTC m=+3.998861710,LastTimestamp:2026-03-08 19:31:42.602807677 +0000 UTC m=+3.998861710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.406782 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49000f69ed4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.652341972 +0000 UTC m=+4.048396035,LastTimestamp:2026-03-08 
19:31:42.652341972 +0000 UTC m=+4.048396035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.413539 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49001b75ebe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.664974014 +0000 UTC m=+4.061028077,LastTimestamp:2026-03-08 19:31:42.664974014 +0000 UTC m=+4.061028077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.421384 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af490300e4bd2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 
19:31:43.442422738 +0000 UTC m=+4.838476801,LastTimestamp:2026-03-08 19:31:43.442422738 +0000 UTC m=+4.838476801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.428233 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af4903feb6971 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:43.708572017 +0000 UTC m=+5.104626080,LastTimestamp:2026-03-08 19:31:43.708572017 +0000 UTC m=+5.104626080,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.434594 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49040a3d91a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:43.720659226 +0000 UTC m=+5.116713289,LastTimestamp:2026-03-08 19:31:43.720659226 +0000 UTC 
m=+5.116713289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.440727 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49040bcf9e3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:43.722306019 +0000 UTC m=+5.118360072,LastTimestamp:2026-03-08 19:31:43.722306019 +0000 UTC m=+5.118360072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.446507 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49050bda758 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:43.99078588 +0000 UTC m=+5.386839903,LastTimestamp:2026-03-08 
19:31:43.99078588 +0000 UTC m=+5.386839903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.450157 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49051cabc65 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.008420453 +0000 UTC m=+5.404474516,LastTimestamp:2026-03-08 19:31:44.008420453 +0000 UTC m=+5.404474516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.454996 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49051e886da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.010372826 +0000 UTC 
m=+5.406426849,LastTimestamp:2026-03-08 19:31:44.010372826 +0000 UTC m=+5.406426849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.460411 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af490612c9e79 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.266493561 +0000 UTC m=+5.662547614,LastTimestamp:2026-03-08 19:31:44.266493561 +0000 UTC m=+5.662547614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.466229 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49062209e75 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.282484341 +0000 UTC m=+5.678538404,LastTimestamp:2026-03-08 19:31:44.282484341 +0000 UTC 
m=+5.678538404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.470848 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49062397be6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.284113894 +0000 UTC m=+5.680167957,LastTimestamp:2026-03-08 19:31:44.284113894 +0000 UTC m=+5.680167957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.476054 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49071e2babc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.546863804 +0000 UTC 
m=+5.942917867,LastTimestamp:2026-03-08 19:31:44.546863804 +0000 UTC m=+5.942917867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.480810 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49072d06f32 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.562442034 +0000 UTC m=+5.958496097,LastTimestamp:2026-03-08 19:31:44.562442034 +0000 UTC m=+5.958496097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.485457 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49072ea761b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.564147739 +0000 UTC m=+5.960201802,LastTimestamp:2026-03-08 19:31:44.564147739 +0000 UTC m=+5.960201802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.491766 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49082ff62e6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.833954534 +0000 UTC m=+6.230008597,LastTimestamp:2026-03-08 19:31:44.833954534 +0000 UTC m=+6.230008597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.497967 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49083fcfcb0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 
19:31:44.850574512 +0000 UTC m=+6.246628565,LastTimestamp:2026-03-08 19:31:44.850574512 +0000 UTC m=+6.246628565,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.507452 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 19:32:05 crc kubenswrapper[4885]: &Event{ObjectMeta:{kube-controller-manager-crc.189af49283a15245 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 19:32:05 crc kubenswrapper[4885]: body: Mar 08 19:32:05 crc kubenswrapper[4885]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.434501701 +0000 UTC m=+14.830555724,LastTimestamp:2026-03-08 19:31:53.434501701 +0000 UTC m=+14.830555724,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 19:32:05 crc kubenswrapper[4885]: > Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.513884 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af49283a23206 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.434558982 +0000 UTC m=+14.830613005,LastTimestamp:2026-03-08 19:31:53.434558982 +0000 UTC m=+14.830613005,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.518876 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 19:32:05 crc kubenswrapper[4885]: &Event{ObjectMeta:{kube-apiserver-crc.189af492a11a4f45 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:6443/livez": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 19:32:05 crc kubenswrapper[4885]: body: Mar 08 19:32:05 crc kubenswrapper[4885]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.928970053 +0000 UTC m=+15.325024086,LastTimestamp:2026-03-08 19:31:53.928970053 +0000 UTC 
m=+15.325024086,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 19:32:05 crc kubenswrapper[4885]: > Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.523590 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af492a11b5436 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.929036854 +0000 UTC m=+15.325090887,LastTimestamp:2026-03-08 19:31:53.929036854 +0000 UTC m=+15.325090887,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.528841 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 19:32:05 crc kubenswrapper[4885]: &Event{ObjectMeta:{kube-apiserver-crc.189af492dc3986f0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 08 19:32:05 crc kubenswrapper[4885]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 19:32:05 crc kubenswrapper[4885]: Mar 08 19:32:05 crc kubenswrapper[4885]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:54.920871664 +0000 UTC m=+16.316925797,LastTimestamp:2026-03-08 19:31:54.920871664 +0000 UTC m=+16.316925797,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 19:32:05 crc kubenswrapper[4885]: > Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.533682 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af492dc3e24f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:54.921174263 +0000 UTC m=+16.317228346,LastTimestamp:2026-03-08 19:31:54.921174263 +0000 UTC m=+16.317228346,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.540536 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 19:32:05 crc kubenswrapper[4885]: &Event{ObjectMeta:{kube-apiserver-crc.189af492dd9773b2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:38860->192.168.126.11:17697: read: connection reset by peer Mar 08 19:32:05 crc kubenswrapper[4885]: body: Mar 08 19:32:05 crc kubenswrapper[4885]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:54.943804338 +0000 UTC m=+16.339858441,LastTimestamp:2026-03-08 19:31:54.943804338 +0000 UTC m=+16.339858441,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 19:32:05 crc kubenswrapper[4885]: > Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.544450 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af492dd9b640b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38860->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:54.944062475 +0000 UTC m=+16.340116538,LastTimestamp:2026-03-08 19:31:54.944062475 +0000 UTC m=+16.340116538,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.550500 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189af48fef09a959\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fef09a959 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.351599961 +0000 UTC m=+3.747653994,LastTimestamp:2026-03-08 19:31:55.498344682 +0000 UTC m=+16.894398735,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: 
E0308 19:32:05.559360 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189af49283a15245\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 19:32:05 crc kubenswrapper[4885]: &Event{ObjectMeta:{kube-controller-manager-crc.189af49283a15245 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 19:32:05 crc kubenswrapper[4885]: body: Mar 08 19:32:05 crc kubenswrapper[4885]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.434501701 +0000 UTC m=+14.830555724,LastTimestamp:2026-03-08 19:32:03.435069787 +0000 UTC m=+24.831123860,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 19:32:05 crc kubenswrapper[4885]: > Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.565855 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189af49283a23206\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af49283a23206 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.434558982 +0000 UTC m=+14.830613005,LastTimestamp:2026-03-08 19:32:03.435154949 +0000 UTC m=+24.831209012,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.952075 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.952300 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.953811 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.953862 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.953881 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.954632 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.955042 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:06 crc kubenswrapper[4885]: I0308 19:32:06.281187 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:06 crc kubenswrapper[4885]: W0308 19:32:06.337711 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 08 19:32:06 crc kubenswrapper[4885]: E0308 19:32:06.337779 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 19:32:07 crc kubenswrapper[4885]: I0308 19:32:07.279081 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:07 crc kubenswrapper[4885]: W0308 19:32:07.601751 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:07 crc kubenswrapper[4885]: E0308 19:32:07.601824 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.278739 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:08 crc kubenswrapper[4885]: E0308 19:32:08.344786 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.351567 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.353426 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.353496 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.353525 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.353574 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 08 19:32:08 crc kubenswrapper[4885]: E0308 19:32:08.355344 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 08 19:32:09 crc kubenswrapper[4885]: I0308 19:32:09.279148 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:09 crc kubenswrapper[4885]: E0308 19:32:09.483980 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 08 19:32:10 crc kubenswrapper[4885]: W0308 19:32:10.046018 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 08 19:32:10 crc kubenswrapper[4885]: E0308 19:32:10.046092 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 08 19:32:10 crc kubenswrapper[4885]: I0308 19:32:10.279090 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.258774 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.258955 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.260248 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.260308 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.260320 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.262635 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.277697 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.554156 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.555565 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.555629 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.555653 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:12 crc kubenswrapper[4885]: I0308 19:32:12.277912 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:13 crc kubenswrapper[4885]: I0308 19:32:13.277586 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:14 crc kubenswrapper[4885]: I0308 19:32:14.278643 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.278604 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:15 crc kubenswrapper[4885]: E0308 19:32:15.353163 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.356183 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.358237 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.358319 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.358345 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.358391 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 08 19:32:15 crc kubenswrapper[4885]: E0308 19:32:15.360737 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 08 19:32:16 crc kubenswrapper[4885]: I0308 19:32:16.279277 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:17 crc kubenswrapper[4885]: I0308 19:32:17.274753 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:18 crc kubenswrapper[4885]: I0308 19:32:18.279455 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.278229 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.367653 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.368844 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.368937 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.368977 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.369869 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8"
Mar 08 19:32:19 crc kubenswrapper[4885]: E0308 19:32:19.484244 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.582680 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.275816 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.591667 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.592448 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.596070 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204" exitCode=255
Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.596128 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204"}
Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.596216 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8"
Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.596540 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.601117 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.601173 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.601196 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.602362 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204"
Mar 08 19:32:20 crc kubenswrapper[4885]: E0308 19:32:20.602659 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 08 19:32:20 crc kubenswrapper[4885]: W0308 19:32:20.604805 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:20 crc kubenswrapper[4885]: E0308 19:32:20.604857 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 08 19:32:21 crc kubenswrapper[4885]: I0308 19:32:21.278555 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:21 crc kubenswrapper[4885]: I0308 19:32:21.602245 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.278347 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.361678 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:22 crc kubenswrapper[4885]: E0308 19:32:22.362383 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.363348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.363407 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.363426 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.363469 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 08 19:32:22 crc kubenswrapper[4885]: E0308 19:32:22.370032 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 08 19:32:22 crc kubenswrapper[4885]: W0308 19:32:22.898807 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 08 19:32:22 crc kubenswrapper[4885]: E0308 19:32:22.898880 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.279512 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.677177 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.677451 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.678987 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.679035 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.679053 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.679720 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204"
Mar 08 19:32:23 crc kubenswrapper[4885]: E0308 19:32:23.680015 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 08 19:32:24 crc kubenswrapper[4885]: I0308 19:32:24.278511 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:24 crc kubenswrapper[4885]: W0308 19:32:24.687835 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 08 19:32:24 crc kubenswrapper[4885]: E0308 19:32:24.687955 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.279369 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.952160 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.952422 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.953870 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.953934 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.953945 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.954674 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204"
Mar 08 19:32:25 crc kubenswrapper[4885]: E0308 19:32:25.955103 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 08 19:32:26 crc kubenswrapper[4885]: I0308 19:32:26.279805 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:27 crc kubenswrapper[4885]: I0308 19:32:27.279791 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:28 crc kubenswrapper[4885]: I0308 19:32:28.278307 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.279206 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:29 crc kubenswrapper[4885]: E0308 19:32:29.369838 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.370954 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.372546 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.372591 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.372608 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.372642 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 08 19:32:29 crc kubenswrapper[4885]: E0308 19:32:29.379778 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 08 19:32:29 crc kubenswrapper[4885]: E0308 19:32:29.484669 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 08 19:32:30 crc kubenswrapper[4885]: I0308 19:32:30.279065 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.046283 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.046471 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.047844 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.047902 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.047981 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.279376 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:32 crc kubenswrapper[4885]: I0308 19:32:32.277698 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:32 crc kubenswrapper[4885]: W0308 19:32:32.398109 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 08 19:32:32 crc kubenswrapper[4885]: E0308 19:32:32.398205 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 08 19:32:33 crc kubenswrapper[4885]: I0308 19:32:33.278584 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:34 crc kubenswrapper[4885]: I0308 19:32:34.279640 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:35 crc kubenswrapper[4885]: I0308 19:32:35.278321 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.278642 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:36 crc kubenswrapper[4885]: E0308 19:32:36.372671 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.380335 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.381946 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.382002 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.382021 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.382054 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 08 19:32:36 crc kubenswrapper[4885]: E0308 19:32:36.388330 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.278518 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.367653 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.370013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.370229 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.370419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.371520 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204"
Mar 08 19:32:37 crc kubenswrapper[4885]: E0308 19:32:37.372048 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 08 19:32:38 crc kubenswrapper[4885]: I0308 19:32:38.277822 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:39 crc kubenswrapper[4885]: I0308 19:32:39.279100 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:39 crc kubenswrapper[4885]: E0308 19:32:39.485687 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 08 19:32:40 crc kubenswrapper[4885]: I0308 19:32:40.286415 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:41 crc kubenswrapper[4885]: I0308 19:32:41.279345 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:42 crc kubenswrapper[4885]: I0308 19:32:42.278223 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.278817 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 08 19:32:43 crc kubenswrapper[4885]: E0308 19:32:43.380966 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.389292 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.391797 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.391857 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.391872 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.391950 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 08 19:32:43 crc kubenswrapper[4885]: E0308 19:32:43.398996 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 08 19:32:44 crc kubenswrapper[4885]: I0308 19:32:44.077277 4885 csr.go:261] certificate signing request csr-kbpxr is approved, waiting to be issued
Mar 08 19:32:44 crc kubenswrapper[4885]: I0308 19:32:44.085429 4885 csr.go:257] certificate signing request csr-kbpxr is issued
Mar 08 19:32:44 crc kubenswrapper[4885]: I0308 19:32:44.100879 4885 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 08 19:32:44 crc kubenswrapper[4885]: I0308 19:32:44.133668 4885 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 08 19:32:45 crc kubenswrapper[4885]: I0308 19:32:45.087378 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-15 13:19:33.108853933 +0000 UTC
Mar 08 19:32:45 crc kubenswrapper[4885]: I0308 19:32:45.087435 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6761h46m48.02142347s for next certificate rotation
Mar 08 19:32:49 crc kubenswrapper[4885]: E0308 19:32:49.486967 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.367159 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.368482 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.368511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.368520 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.399543 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.400511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.400549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.400559 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.400637 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.407904 4885 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.408158 4885 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.408180 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.411385 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.411445 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.411455 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.411471 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.411484 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:32:50Z","lastTransitionTime":"2026-03-08T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.432178 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.442680 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.442728 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.442746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.442771 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.442788 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:32:50Z","lastTransitionTime":"2026-03-08T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.457107 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.465271 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.465311 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.465327 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.465346 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.465368 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:32:50Z","lastTransitionTime":"2026-03-08T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.482362 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.492640 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.492684 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.492695 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.492716 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.492732 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:32:50Z","lastTransitionTime":"2026-03-08T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.509688 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.509910 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.509983 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.610756 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.711677 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.812227 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.912811 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.013410 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.113975 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.214424 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.315132 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.368202 4885 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.369694 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.369966 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.370157 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.371435 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.416897 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.517790 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.618294 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.692881 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.696463 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e"} Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.696654 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.698264 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.698327 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.698346 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.718644 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.819327 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.920282 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.021152 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.121500 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.221901 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.325864 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.426015 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.526755 4885 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.627362 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.702405 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.703271 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.706012 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" exitCode=255 Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.706072 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e"} Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.706142 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.706287 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.707678 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.707795 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:52 
crc kubenswrapper[4885]: I0308 19:32:52.707821 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.708753 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.709189 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.728110 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.828578 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.928891 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.030055 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.130192 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.231216 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.331415 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 
19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.432413 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.533216 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.634172 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.677777 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.711518 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.714757 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.716322 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.716386 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.716405 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.717630 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e"
Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.717988 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.734576 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.835166 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.936222 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.036412 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.137351 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.237784 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.338692 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.439770 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.539896 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.640347 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.741136 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.841518 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.942171 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.042628 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.143746 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.244195 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.344726 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.445170 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.545489 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.646670 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.747587 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.847885 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.948312 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.951655 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.951999 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.956372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.956448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.956474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.959028 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e"
Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.959779 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.048460 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.149299 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.250441 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.351033 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.452101 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.552603 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.653745 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.753911 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.854366 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.954463 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.055556 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.156206 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.256786 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.357563 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.458364 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.558723 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.659364 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.760509 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.861624 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.962825 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.063028 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.163521 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.264486 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.365324 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.466435 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.567842 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.669849 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.770830 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.871895 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.972912 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.073299 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.174426 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.275309 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.375868 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.476426 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.487817 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.576749 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.678471 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.779330 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.880416 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.980717 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.081685 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.182104 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.282883 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.383349 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.483869 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.584253 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.655913 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.663998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.664095 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.664123 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.664164 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.664192 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:00Z","lastTransitionTime":"2026-03-08T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.681252 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.692416 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.692464 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.692483 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.692510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.692529 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:00Z","lastTransitionTime":"2026-03-08T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.709095 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.719590 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.719678 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.719700 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.719732 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.719758 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:00Z","lastTransitionTime":"2026-03-08T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.735694 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.747420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.747698 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.748002 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.748263 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.748512 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:00Z","lastTransitionTime":"2026-03-08T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.766601 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.767318 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.767486 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.867722 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.968495 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.069018 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.169214 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.270593 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.371544 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.471661 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.572916 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.673601 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.775279 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.875480 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.976994 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.078205 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.161777 4885 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.184623 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.185083 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.185316 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.185507 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.185669 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.289901 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.290339 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.290497 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.290665 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.290804 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.334113 4885 apiserver.go:52] "Watching apiserver"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.347723 4885 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.348815 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt","openshift-multus/multus-additional-cni-plugins-25vxd","openshift-multus/multus-ff7b4","openshift-network-diagnostics/network-check-target-xd92c","openshift-dns/node-resolver-w5lms","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-ovn-kubernetes/ovnkube-node-bssfh","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-image-registry/node-ca-57qch","openshift-machine-config-operator/machine-config-daemon-ttb97","openshift-multus/network-metrics-daemon-jps4r","openshift-network-operator/iptables-alerter-4ln5h"]
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.349545 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.349678 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.350893 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.351013 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w5lms"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.351054 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.351438 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ff7b4"
Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.351579 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.351774 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.351813 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.352067 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.352125 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.352535 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.354227 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.354420 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-25vxd"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.354291 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-57qch"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.354599 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.355503 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.355706 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.360615 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.360635 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.361116 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.361317 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.361390 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.361703 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362046 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362323 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362589 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362709 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362607 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362875 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363149 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363620 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363662 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363854 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363907 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363861 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364026 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364172 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364274 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364278 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364373 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364781 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365207 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365401 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365412 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365429 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365547 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365614 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365774 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365850 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365959 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.366129 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.366397 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.366526 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.366660 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.382453 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389246 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cnibin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389310 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389352 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-rootfs\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389393 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b33b5271-bda3-41ca-81a3-d47fff657c27-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389435 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-system-cni-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389454 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-daemon-config\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389533 4885 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389615 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cfac2d6-6888-4b2d-982e-826f583396e8-host\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389654 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr8jn\" (UniqueName: \"kubernetes.io/projected/2f639c4e-64b8-45e9-bf33-c1d8c376b438-kube-api-access-mr8jn\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389684 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389722 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pllt\" (UniqueName: \"kubernetes.io/projected/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-kube-api-access-7pllt\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389756 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6kv8\" (UniqueName: \"kubernetes.io/projected/bc890659-71a7-4024-bae6-e1e1ef563f17-kube-api-access-d6kv8\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389789 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389810 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-kubelet\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389834 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389879 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389914 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389967 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390003 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bpng\" (UniqueName: \"kubernetes.io/projected/b33b5271-bda3-41ca-81a3-d47fff657c27-kube-api-access-2bpng\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390034 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-cni-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390059 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-conf-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390086 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc890659-71a7-4024-bae6-e1e1ef563f17-hosts-file\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390109 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390137 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390161 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-etc-kubernetes\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390186 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390269 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-proxy-tls\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390295 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390327 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390356 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-socket-dir-parent\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390407 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-hostroot\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390432 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390461 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390486 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 08 19:33:02 crc
kubenswrapper[4885]: I0308 19:33:02.390507 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-os-release\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390557 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390582 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390606 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390630 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390681 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390702 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-netns\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390746 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-bin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390769 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-mcd-auth-proxy-config\") pod \"machine-config-daemon-ttb97\" (UID: 
\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390789 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-k8s-cni-cncf-io\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390827 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-multus\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390851 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390893 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0cfac2d6-6888-4b2d-982e-826f583396e8-serviceca\") pod 
\"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390913 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-system-cni-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390951 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390977 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391007 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391030 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391057 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391081 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njr92\" (UniqueName: \"kubernetes.io/projected/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-kube-api-access-njr92\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391125 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: 
I0308 19:33:02.391145 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cni-binary-copy\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mlvt\" (UniqueName: \"kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391216 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r95ct\" (UniqueName: \"kubernetes.io/projected/0cfac2d6-6888-4b2d-982e-826f583396e8-kube-api-access-r95ct\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391239 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391261 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-multus-certs\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391284 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.391751 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.392148 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.392608 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:02.892106735 +0000 UTC m=+84.288160798 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.392641 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:02.892625409 +0000 UTC m=+84.288679472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.393058 4885 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.393693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.394554 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.395459 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.397627 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.399866 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.399908 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.399937 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.399958 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 
19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.399972 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.406410 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.411201 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.411252 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.411277 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.411356 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2026-03-08 19:33:02.911331532 +0000 UTC m=+84.307385595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.415173 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.415213 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.415232 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.415297 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:02.91527818 +0000 UTC m=+84.311332243 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.415435 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.417239 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.421797 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.424154 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.425107 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.432399 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 
127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.457383 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.464121 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.471182 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.485764 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-a0b4213f6e1b55ac850ea4fd9ceaff623bc4fbc0699f7f8abd2524235c6c34a3 WatchSource:0}: Error finding container a0b4213f6e1b55ac850ea4fd9ceaff623bc4fbc0699f7f8abd2524235c6c34a3: Status 404 returned error can't find the container with id a0b4213f6e1b55ac850ea4fd9ceaff623bc4fbc0699f7f8abd2524235c6c34a3 Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.489833 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491481 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491522 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491543 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491566 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491593 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491613 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491633 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491653 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491678 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491700 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491724 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491743 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491764 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491786 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491809 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491883 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491909 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491990 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492052 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492076 
4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492118 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493022 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493052 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493078 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493102 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493124 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493146 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493168 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493218 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 
19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493240 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493263 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493356 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493382 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493403 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493425 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493446 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493478 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493500 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493527 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493554 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493574 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493598 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493624 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493651 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493672 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493693 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493717 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493740 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493766 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493789 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493813 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 08 19:33:02 crc 
kubenswrapper[4885]: I0308 19:33:02.493836 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493861 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493886 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.493856 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: source "/env/_master" Mar 08 19:33:02 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 08 19:33:02 crc kubenswrapper[4885]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 08 19:33:02 crc kubenswrapper[4885]: ho_enable="--enable-hybrid-overlay" Mar 08 19:33:02 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 08 19:33:02 crc kubenswrapper[4885]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 08 19:33:02 crc kubenswrapper[4885]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-host=127.0.0.1 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-port=9743 \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ho_enable} \ Mar 08 19:33:02 crc kubenswrapper[4885]: --enable-interconnect \ Mar 08 19:33:02 crc kubenswrapper[4885]: --disable-approver \ Mar 08 19:33:02 crc kubenswrapper[4885]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --wait-for-kubernetes-api=200s \ Mar 08 19:33:02 crc kubenswrapper[4885]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --loglevel="${LOGLEVEL}" Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493913 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 
08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494032 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494073 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494118 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494161 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494204 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494240 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494277 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494321 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494357 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494398 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494435 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494471 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494509 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494544 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494578 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494614 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494651 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494690 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494729 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494763 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494796 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494833 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494866 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494899 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495003 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495048 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495083 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495128 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495167 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495205 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495245 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495280 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495317 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495354 
4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495387 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495422 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495680 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495763 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495802 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495840 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495875 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495909 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496112 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496158 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496201 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496249 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496336 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492155 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496381 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492198 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496430 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492531 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492615 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492704 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492968 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496479 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496526 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496571 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496613 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496657 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496700 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496744 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496819 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496856 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496896 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496954 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496993 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497030 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497069 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497149 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497185 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497224 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497300 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497337 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497374 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497409 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497445 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497484 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497524 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498987 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499030 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499068 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499107 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499236 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499279 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499314 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 19:33:02 
crc kubenswrapper[4885]: I0308 19:33:02.499350 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499387 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499425 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499460 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499495 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499537 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499581 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499617 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499655 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499692 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499726 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 
19:33:02.499765 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499801 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499837 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499874 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499915 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499979 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500019 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500059 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500096 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500139 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500177 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500214 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500253 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500291 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500332 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500372 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500410 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500450 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500489 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500569 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500608 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 
19:33:02.501090 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501143 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501182 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501218 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501335 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501378 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501421 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501460 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501497 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501538 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501582 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501620 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501660 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501695 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501730 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501768 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501806 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501841 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501885 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501952 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501990 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502112 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-bin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502166 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502210 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502539 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-netns\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502628 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502671 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-mcd-auth-proxy-config\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502712 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-k8s-cni-cncf-io\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502749 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-multus\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502790 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502827 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502904 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/0cfac2d6-6888-4b2d-982e-826f583396e8-serviceca\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503022 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-system-cni-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-os-release\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504724 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mlvt\" (UniqueName: 
\"kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504075 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505569 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505587 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505598 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.506073 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.506003 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njr92\" (UniqueName: \"kubernetes.io/projected/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-kube-api-access-njr92\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.506458 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.508962 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: 
\"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509078 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cni-binary-copy\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509178 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509279 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509357 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r95ct\" (UniqueName: \"kubernetes.io/projected/0cfac2d6-6888-4b2d-982e-826f583396e8-kube-api-access-r95ct\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509397 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509642 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-multus-certs\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493182 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493266 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493568 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493554 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493611 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493976 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494109 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494163 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494187 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494382 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509905 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494454 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494536 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495049 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495729 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495823 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496355 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496499 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496945 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496968 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497611 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497852 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497958 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497952 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498069 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497646 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498408 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498760 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498770 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498806 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499506 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499513 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500001 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500040 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500451 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500471 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500461 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501849 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502296 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502311 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502897 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503025 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503256 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503383 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503482 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504343 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504426 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504492 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505057 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505652 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505792 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505814 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505835 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505856 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505862 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.506600 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.507086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.507344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.508050 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.508512 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.508590 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.508955 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509490 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.510464 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.510517 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.510532 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.510856 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.511084 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.511165 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.511263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.511686 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.512686 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.512989 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.513112 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.513223 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.513195 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.513571 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.513909 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514024 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514209 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514236 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514310 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514488 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514578 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514429 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.515908 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.515996 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.516054 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.516174 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.516168 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517028 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517115 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517218 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517674 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517695 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.517716 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.017666915 +0000 UTC m=+84.413720948 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517806 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-multus-certs\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.518146 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.518297 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.518867 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519132 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519148 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519493 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519509 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519550 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519975 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.520065 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.520293 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.520398 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.520753 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cni-binary-copy\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521179 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521588 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521607 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521625 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521627 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521796 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521913 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521996 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.522147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.522337 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.522724 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.523009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.524601 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.525246 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.525476 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.525521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-netns\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.525568 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.526251 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-k8s-cni-cncf-io\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.526298 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-multus\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.526340 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.526389 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.527599 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.527432 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b33b5271-bda3-41ca-81a3-d47fff657c27-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.525227 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-bin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.527715 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-system-cni-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.527865 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509751 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b33b5271-bda3-41ca-81a3-d47fff657c27-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: 
\"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528012 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cnibin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528091 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zkpk\" (UniqueName: \"kubernetes.io/projected/ac600107-0c97-4ec8-89f6-598b40c166ee-kube-api-access-2zkpk\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528157 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528218 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-rootfs\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528282 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr8jn\" (UniqueName: \"kubernetes.io/projected/2f639c4e-64b8-45e9-bf33-c1d8c376b438-kube-api-access-mr8jn\") pod \"network-metrics-daemon-jps4r\" (UID: 
\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528305 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-system-cni-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528409 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-daemon-config\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528469 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config\") pod \"ovnkube-node-bssfh\" (UID: 
\"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528557 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528597 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cfac2d6-6888-4b2d-982e-826f583396e8-host\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528656 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: 
\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528804 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pllt\" (UniqueName: \"kubernetes.io/projected/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-kube-api-access-7pllt\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528819 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-mcd-auth-proxy-config\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528896 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-cnibin\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6kv8\" (UniqueName: \"kubernetes.io/projected/bc890659-71a7-4024-bae6-e1e1ef563f17-kube-api-access-d6kv8\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529321 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-kubelet\") pod \"multus-ff7b4\" (UID: 
\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529386 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529388 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529450 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529506 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc 
kubenswrapper[4885]: I0308 19:33:02.529690 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bpng\" (UniqueName: \"kubernetes.io/projected/b33b5271-bda3-41ca-81a3-d47fff657c27-kube-api-access-2bpng\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529707 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cnibin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529752 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-cni-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529811 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-conf-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529867 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc890659-71a7-4024-bae6-e1e1ef563f17-hosts-file\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529961 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-proxy-tls\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530278 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530534 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-etc-kubernetes\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530588 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530601 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-etc-kubernetes\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530651 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530676 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530702 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530747 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-socket-dir-parent\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-hostroot\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530801 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-os-release\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530895 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531033 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531096 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531123 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531167 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-kubelet\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-rootfs\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-system-cni-dir\") pod \"multus-ff7b4\" (UID: 
\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531549 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-cni-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531588 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc890659-71a7-4024-bae6-e1e1ef563f17-hosts-file\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.531625 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.531717 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.031687159 +0000 UTC m=+84.427741392 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531743 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531764 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531836 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531612 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-conf-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533301 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533311 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-daemon-config\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533453 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-socket-dir-parent\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533514 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-hostroot\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-os-release\") pod \"multus-ff7b4\" (UID: 
\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.534168 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r95ct\" (UniqueName: \"kubernetes.io/projected/0cfac2d6-6888-4b2d-982e-826f583396e8-kube-api-access-r95ct\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.534224 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cfac2d6-6888-4b2d-982e-826f583396e8-host\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.534261 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535249 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535313 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535350 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" 
DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535382 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535410 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535438 4885 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535467 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535495 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535520 4885 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535552 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535579 4885 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535610 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535640 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535668 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535698 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535729 4885 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535759 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535792 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535820 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535847 4885 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535875 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535905 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535967 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535997 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536025 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" 
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536231 4885 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536263 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536277 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536290 4885 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536307 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536321 4885 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536336 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536427 4885 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536446 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536460 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536479 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536494 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536509 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536522 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536536 4885 reconciler_common.go:293] 
"Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.537562 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.539743 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0cfac2d6-6888-4b2d-982e-826f583396e8-serviceca\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547782 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547815 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547830 4885 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547846 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" 
DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547862 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547877 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547895 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547908 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547941 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547957 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547968 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547981 4885 
reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547994 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548006 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548024 4885 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548038 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548053 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548039 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548066 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548132 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548146 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548158 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548170 4885 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548182 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548194 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc 
kubenswrapper[4885]: I0308 19:33:02.548205 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548224 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548237 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548254 4885 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548266 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548276 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548292 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548303 4885 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548313 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548324 4885 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548333 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548343 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548353 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548362 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548373 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548448 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548777 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-proxy-tls\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548894 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njr92\" (UniqueName: \"kubernetes.io/projected/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-kube-api-access-njr92\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.550666 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.551009 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mlvt\" (UniqueName: \"kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.551155 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: source "/env/_master" Mar 08 19:33:02 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --disable-webhook \ Mar 08 19:33:02 crc kubenswrapper[4885]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --loglevel="${LOGLEVEL}" Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.551384 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.551873 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.551903 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.552009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.552263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.552327 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.552820 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.552969 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.552955 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bpng\" (UniqueName: \"kubernetes.io/projected/b33b5271-bda3-41ca-81a3-d47fff657c27-kube-api-access-2bpng\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.553028 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.553595 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.553733 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.553822 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554001 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554018 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554027 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554209 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554340 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554411 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554410 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554496 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554562 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6kv8\" (UniqueName: \"kubernetes.io/projected/bc890659-71a7-4024-bae6-e1e1ef563f17-kube-api-access-d6kv8\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554583 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554851 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554974 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554987 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.555300 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.555484 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.555327 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.555982 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.556159 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr8jn\" (UniqueName: \"kubernetes.io/projected/2f639c4e-64b8-45e9-bf33-c1d8c376b438-kube-api-access-mr8jn\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.556716 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.557056 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.557244 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.557743 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.557890 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558072 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558155 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558250 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558300 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558352 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558428 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558394 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.559159 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pllt\" (UniqueName: \"kubernetes.io/projected/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-kube-api-access-7pllt\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.559323 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.559334 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.559580 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.560080 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.560603 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.560801 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.560956 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.560983 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.561413 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562127 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562224 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562513 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562680 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562790 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562852 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562891 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.563036 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.563274 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.563302 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.563720 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.563911 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.564052 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.569884 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.573071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.578652 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.586200 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.595912 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.595997 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.601962 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.608494 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.608527 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.608539 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.608557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.608570 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650028 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650167 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650245 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-os-release\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650419 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zkpk\" (UniqueName: \"kubernetes.io/projected/ac600107-0c97-4ec8-89f6-598b40c166ee-kube-api-access-2zkpk\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " 
pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650517 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650581 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-cnibin\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650689 4885 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650721 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650744 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650765 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" 
DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650785 4885 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650803 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650822 4885 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650842 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650861 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650881 4885 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650900 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: 
I0308 19:33:02.650954 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650974 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650993 4885 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651012 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651032 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651103 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-cnibin\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651106 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" 
Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651155 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651176 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651194 4885 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651211 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651227 4885 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651243 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651261 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651285 4885 reconciler_common.go:293] "Volume detached for 
volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651303 4885 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652100 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652134 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652155 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652177 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651116 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652210 4885 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652231 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652254 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652277 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652298 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652319 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-os-release\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652340 4885 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652360 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652439 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652516 4885 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652541 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652560 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652617 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652638 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652691 4885 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652789 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652811 4885 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652830 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652883 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652903 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652995 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653022 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653084 4885 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653105 4885 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653128 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 
19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653185 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653203 4885 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653220 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653276 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653294 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653312 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653373 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653392 4885 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653412 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653468 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653485 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653503 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653559 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653576 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653595 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node 
\"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653653 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653673 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653690 4885 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653750 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653768 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653785 4885 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653842 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 
19:33:02.653859 4885 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653877 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653993 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654023 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654086 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654103 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654123 4885 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654183 4885 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654202 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654257 4885 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654276 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654293 4885 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654345 4885 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654367 4885 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654385 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654402 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654419 4885 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654438 4885 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654455 4885 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654515 4885 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654533 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654613 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654631 4885 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654648 4885 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654665 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654682 4885 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654700 4885 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654717 4885 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654734 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654750 4885 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654802 4885 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654822 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654839 4885 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654855 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654905 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.655312 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.655341 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.655359 4885 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.673855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zkpk\" (UniqueName: \"kubernetes.io/projected/ac600107-0c97-4ec8-89f6-598b40c166ee-kube-api-access-2zkpk\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.685488 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.699786 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.706161 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-32129bf4ebb6810c5df5bbbd2b59359e58530376cc22115b360aa8fa3d75043c WatchSource:0}: Error finding container 32129bf4ebb6810c5df5bbbd2b59359e58530376cc22115b360aa8fa3d75043c: Status 404 returned error can't find the container with id 32129bf4ebb6810c5df5bbbd2b59359e58530376cc22115b360aa8fa3d75043c Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.711895 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.711967 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.711988 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.712014 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.712033 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.712245 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: source /etc/kubernetes/apiserver-url.env Mar 08 19:33:02 crc kubenswrapper[4885]: else Mar 08 19:33:02 crc kubenswrapper[4885]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 19:33:02 crc kubenswrapper[4885]: exit 1 Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 19:33:02 crc kubenswrapper[4885]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.713521 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.725353 4885 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc890659_71a7_4024_bae6_e1e1ef563f17.slice/crio-618bff7c84ae8c1591d90608b9d6fb9bb1e6dfda38c21c153f9db9b66aa40ae8 WatchSource:0}: Error finding container 618bff7c84ae8c1591d90608b9d6fb9bb1e6dfda38c21c153f9db9b66aa40ae8: Status 404 returned error can't find the container with id 618bff7c84ae8c1591d90608b9d6fb9bb1e6dfda38c21c153f9db9b66aa40ae8 Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.728940 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:02 crc kubenswrapper[4885]: set -uo pipefail Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 08 19:33:02 crc kubenswrapper[4885]: HOSTS_FILE="/etc/hosts" Mar 08 19:33:02 crc kubenswrapper[4885]: TEMP_FILE="/etc/hosts.tmp" Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Make a temporary file with the old hosts file's attributes. Mar 08 19:33:02 crc kubenswrapper[4885]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 08 19:33:02 crc kubenswrapper[4885]: echo "Failed to preserve hosts file. Exiting." 
Mar 08 19:33:02 crc kubenswrapper[4885]: exit 1 Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: while true; do Mar 08 19:33:02 crc kubenswrapper[4885]: declare -A svc_ips Mar 08 19:33:02 crc kubenswrapper[4885]: for svc in "${services[@]}"; do Mar 08 19:33:02 crc kubenswrapper[4885]: # Fetch service IP from cluster dns if present. We make several tries Mar 08 19:33:02 crc kubenswrapper[4885]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 08 19:33:02 crc kubenswrapper[4885]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 08 19:33:02 crc kubenswrapper[4885]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 08 19:33:02 crc kubenswrapper[4885]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 08 19:33:02 crc kubenswrapper[4885]: for i in ${!cmds[*]} Mar 08 19:33:02 crc kubenswrapper[4885]: do Mar 08 19:33:02 crc kubenswrapper[4885]: ips=($(eval "${cmds[i]}")) Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: svc_ips["${svc}"]="${ips[@]}" Mar 08 19:33:02 crc kubenswrapper[4885]: break Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Update /etc/hosts only if we get valid service IPs Mar 08 19:33:02 crc kubenswrapper[4885]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 08 19:33:02 crc kubenswrapper[4885]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 08 19:33:02 crc kubenswrapper[4885]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 08 19:33:02 crc kubenswrapper[4885]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: continue Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Append resolver entries for services Mar 08 19:33:02 crc kubenswrapper[4885]: rc=0 Mar 08 19:33:02 crc kubenswrapper[4885]: for svc in "${!svc_ips[@]}"; do Mar 08 19:33:02 crc kubenswrapper[4885]: for ip in ${svc_ips[${svc}]}; do Mar 08 19:33:02 crc kubenswrapper[4885]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ $rc -ne 0 ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: continue Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 08 19:33:02 crc kubenswrapper[4885]: # Replace /etc/hosts with our modified version if needed Mar 08 19:33:02 crc kubenswrapper[4885]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 08 19:33:02 crc kubenswrapper[4885]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: unset svc_ips Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d6kv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-w5lms_openshift-dns(bc890659-71a7-4024-bae6-e1e1ef563f17): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.730111 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-w5lms" podUID="bc890659-71a7-4024-bae6-e1e1ef563f17" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.743621 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.744351 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5lms" event={"ID":"bc890659-71a7-4024-bae6-e1e1ef563f17","Type":"ContainerStarted","Data":"618bff7c84ae8c1591d90608b9d6fb9bb1e6dfda38c21c153f9db9b66aa40ae8"} Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.746269 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:02 crc kubenswrapper[4885]: set -uo pipefail Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 08 19:33:02 crc kubenswrapper[4885]: HOSTS_FILE="/etc/hosts" Mar 08 19:33:02 crc kubenswrapper[4885]: TEMP_FILE="/etc/hosts.tmp" Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Make a temporary file with the old hosts file's attributes. Mar 08 19:33:02 crc kubenswrapper[4885]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 08 19:33:02 crc kubenswrapper[4885]: echo "Failed to preserve hosts file. Exiting." 
Mar 08 19:33:02 crc kubenswrapper[4885]: exit 1 Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: while true; do Mar 08 19:33:02 crc kubenswrapper[4885]: declare -A svc_ips Mar 08 19:33:02 crc kubenswrapper[4885]: for svc in "${services[@]}"; do Mar 08 19:33:02 crc kubenswrapper[4885]: # Fetch service IP from cluster dns if present. We make several tries Mar 08 19:33:02 crc kubenswrapper[4885]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 08 19:33:02 crc kubenswrapper[4885]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 08 19:33:02 crc kubenswrapper[4885]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 08 19:33:02 crc kubenswrapper[4885]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 08 19:33:02 crc kubenswrapper[4885]: for i in ${!cmds[*]} Mar 08 19:33:02 crc kubenswrapper[4885]: do Mar 08 19:33:02 crc kubenswrapper[4885]: ips=($(eval "${cmds[i]}")) Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: svc_ips["${svc}"]="${ips[@]}" Mar 08 19:33:02 crc kubenswrapper[4885]: break Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Update /etc/hosts only if we get valid service IPs Mar 08 19:33:02 crc kubenswrapper[4885]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 08 19:33:02 crc kubenswrapper[4885]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 08 19:33:02 crc kubenswrapper[4885]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 08 19:33:02 crc kubenswrapper[4885]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: continue Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Append resolver entries for services Mar 08 19:33:02 crc kubenswrapper[4885]: rc=0 Mar 08 19:33:02 crc kubenswrapper[4885]: for svc in "${!svc_ips[@]}"; do Mar 08 19:33:02 crc kubenswrapper[4885]: for ip in ${svc_ips[${svc}]}; do Mar 08 19:33:02 crc kubenswrapper[4885]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ $rc -ne 0 ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: continue Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 08 19:33:02 crc kubenswrapper[4885]: # Replace /etc/hosts with our modified version if needed Mar 08 19:33:02 crc kubenswrapper[4885]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 08 19:33:02 crc kubenswrapper[4885]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: unset svc_ips Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d6kv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-w5lms_openshift-dns(bc890659-71a7-4024-bae6-e1e1ef563f17): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.746418 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"32129bf4ebb6810c5df5bbbd2b59359e58530376cc22115b360aa8fa3d75043c"} Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.747469 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-w5lms" podUID="bc890659-71a7-4024-bae6-e1e1ef563f17" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 
19:33:02.749558 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a0b4213f6e1b55ac850ea4fd9ceaff623bc4fbc0699f7f8abd2524235c6c34a3"} Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.752049 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: source /etc/kubernetes/apiserver-url.env Mar 08 19:33:02 crc kubenswrapper[4885]: else Mar 08 19:33:02 crc kubenswrapper[4885]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 19:33:02 crc kubenswrapper[4885]: exit 1 Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 19:33:02 crc kubenswrapper[4885]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.752782 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: source "/env/_master" Mar 08 19:33:02 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:02 crc 
kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 08 19:33:02 crc kubenswrapper[4885]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 08 19:33:02 crc kubenswrapper[4885]: ho_enable="--enable-hybrid-overlay" Mar 08 19:33:02 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 08 19:33:02 crc kubenswrapper[4885]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 08 19:33:02 crc kubenswrapper[4885]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-host=127.0.0.1 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-port=9743 \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ho_enable} \ Mar 08 19:33:02 crc kubenswrapper[4885]: --enable-interconnect \ Mar 08 19:33:02 crc kubenswrapper[4885]: --disable-approver \ Mar 08 19:33:02 crc kubenswrapper[4885]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --wait-for-kubernetes-api=200s \ Mar 08 19:33:02 crc kubenswrapper[4885]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --loglevel="${LOGLEVEL}" Mar 08 19:33:02 crc kubenswrapper[4885]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc 
kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.753253 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.754540 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.760018 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: source "/env/_master" Mar 08 19:33:02 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --disable-webhook \ Mar 08 19:33:02 crc kubenswrapper[4885]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --loglevel="${LOGLEVEL}" Mar 08 19:33:02 crc kubenswrapper[4885]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.761785 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to 
\"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.762678 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.766914 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-91535920cc74a837cdd04ae33741adccd66ba5dbbe0b8079fc1da551b4b2ffc8 WatchSource:0}: Error finding container 91535920cc74a837cdd04ae33741adccd66ba5dbbe0b8079fc1da551b4b2ffc8: Status 404 returned error can't find the container with id 91535920cc74a837cdd04ae33741adccd66ba5dbbe0b8079fc1da551b4b2ffc8 Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.772411 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.772610 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,Res
izePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.773829 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.780864 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ac72c25_d3e6_4dda_8444_6cd4442af7e4.slice/crio-b756091d164726bb7b3357e4fa113e037011faab93456cc52a0ef3704483935e WatchSource:0}: Error finding container b756091d164726bb7b3357e4fa113e037011faab93456cc52a0ef3704483935e: Status 404 returned error can't find the container with id b756091d164726bb7b3357e4fa113e037011faab93456cc52a0ef3704483935e Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.781671 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.792957 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 08 19:33:02 crc kubenswrapper[4885]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 08 19:33:02 crc kubenswrapper[4885]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7pllt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-ff7b4_openshift-multus(9ac72c25-d3e6-4dda-8444-6cd4442af7e4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.794953 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-ff7b4" podUID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.796476 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.802137 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.811441 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 08 19:33:02 crc kubenswrapper[4885]: apiVersion: v1 Mar 08 19:33:02 crc kubenswrapper[4885]: clusters: Mar 08 19:33:02 crc kubenswrapper[4885]: - cluster: Mar 08 19:33:02 crc kubenswrapper[4885]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 08 19:33:02 crc kubenswrapper[4885]: server: https://api-int.crc.testing:6443 Mar 08 19:33:02 crc kubenswrapper[4885]: name: default-cluster Mar 08 19:33:02 crc kubenswrapper[4885]: contexts: Mar 08 19:33:02 crc kubenswrapper[4885]: - context: Mar 08 19:33:02 crc kubenswrapper[4885]: cluster: default-cluster Mar 08 19:33:02 crc kubenswrapper[4885]: namespace: default Mar 08 19:33:02 crc kubenswrapper[4885]: user: default-auth Mar 08 19:33:02 crc kubenswrapper[4885]: name: default-context Mar 08 19:33:02 crc kubenswrapper[4885]: current-context: default-context Mar 08 19:33:02 crc kubenswrapper[4885]: kind: Config Mar 08 19:33:02 crc kubenswrapper[4885]: preferences: {} Mar 08 19:33:02 crc kubenswrapper[4885]: users: Mar 08 19:33:02 crc kubenswrapper[4885]: - name: default-auth Mar 08 19:33:02 crc kubenswrapper[4885]: user: Mar 08 19:33:02 crc kubenswrapper[4885]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 19:33:02 crc kubenswrapper[4885]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 19:33:02 crc kubenswrapper[4885]: EOF Mar 08 19:33:02 crc kubenswrapper[4885]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mlvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.811730 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.812559 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.814879 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.815091 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.815115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.815146 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.815167 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.817192 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.820584 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb33b5271_bda3_41ca_81a3_d47fff657c27.slice/crio-f3227465f55346847f5550884a45b9b226294df4f4196094a95b4362ce78201b WatchSource:0}: Error finding container f3227465f55346847f5550884a45b9b226294df4f4196094a95b4362ce78201b: Status 404 returned error can't find the container with id f3227465f55346847f5550884a45b9b226294df4f4196094a95b4362ce78201b Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.824706 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.827959 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:02 crc kubenswrapper[4885]: set -euo pipefail Mar 08 19:33:02 crc kubenswrapper[4885]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 08 19:33:02 crc kubenswrapper[4885]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 08 19:33:02 crc kubenswrapper[4885]: # As the secret mount is optional we must wait for the files to be present. Mar 08 19:33:02 crc kubenswrapper[4885]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 08 19:33:02 crc kubenswrapper[4885]: TS=$(date +%s) Mar 08 19:33:02 crc kubenswrapper[4885]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 08 19:33:02 crc kubenswrapper[4885]: HAS_LOGGED_INFO=0 Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: log_missing_certs(){ Mar 08 19:33:02 crc kubenswrapper[4885]: CUR_TS=$(date +%s) Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 08 19:33:02 crc kubenswrapper[4885]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 08 19:33:02 crc kubenswrapper[4885]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 08 19:33:02 crc kubenswrapper[4885]: HAS_LOGGED_INFO=1 Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: } Mar 08 19:33:02 crc kubenswrapper[4885]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 08 19:33:02 crc kubenswrapper[4885]: log_missing_certs Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 5 Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/kube-rbac-proxy \ Mar 08 19:33:02 crc kubenswrapper[4885]: --logtostderr \ Mar 08 19:33:02 crc kubenswrapper[4885]: --secure-listen-address=:9108 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --upstream=http://127.0.0.1:29108/ \ Mar 08 19:33:02 crc kubenswrapper[4885]: --tls-private-key-file=${TLS_PK} \ Mar 08 19:33:02 crc kubenswrapper[4885]: --tls-cert-file=${TLS_CERT} Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bpng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-t2brt_openshift-ovn-kubernetes(b33b5271-bda3-41ca-81a3-d47fff657c27): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.831778 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: source "/env/_master" Mar 08 19:33:02 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v4_join_subnet_opt= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 08 
19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v6_join_subnet_opt= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v4_transit_switch_subnet_opt= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v6_transit_switch_subnet_opt= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: dns_name_resolver_enabled_flag= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "false" == "true" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: persistent_ips_enabled_flag= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "true" == "true" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # This is needed so that converting clusters from GA to TP Mar 08 19:33:02 crc kubenswrapper[4885]: # will rollout control plane pods as well Mar 08 19:33:02 crc kubenswrapper[4885]: network_segmentation_enabled_flag= Mar 08 19:33:02 crc kubenswrapper[4885]: multi_network_enabled_flag= Mar 08 19:33:02 crc 
kubenswrapper[4885]: if [[ "true" == "true" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: multi_network_enabled_flag="--enable-multi-network" Mar 08 19:33:02 crc kubenswrapper[4885]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/ovnkube \ Mar 08 19:33:02 crc kubenswrapper[4885]: --enable-interconnect \ Mar 08 19:33:02 crc kubenswrapper[4885]: --init-cluster-manager "${K8S_NODE}" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 08 19:33:02 crc kubenswrapper[4885]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --metrics-bind-address "127.0.0.1:29108" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --metrics-enable-pprof \ Mar 08 19:33:02 crc kubenswrapper[4885]: --metrics-enable-config-duration \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ovn_v4_join_subnet_opt} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ovn_v6_join_subnet_opt} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${dns_name_resolver_enabled_flag} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${persistent_ips_enabled_flag} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${multi_network_enabled_flag} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${network_segmentation_enabled_flag} Mar 08 19:33:02 crc kubenswrapper[4885]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bpng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-t2brt_openshift-ovn-kubernetes(b33b5271-bda3-41ca-81a3-d47fff657c27): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.832867 4885 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c5dda3b_3e01_4bb4_af02_b0f4eeadda58.slice/crio-3a743dc1993b7ae50186d5ba219f5dc206b18cdde6c8c1f56f6cb58951c8515f WatchSource:0}: Error finding container 3a743dc1993b7ae50186d5ba219f5dc206b18cdde6c8c1f56f6cb58951c8515f: Status 404 returned error can't find the container with id 3a743dc1993b7ae50186d5ba219f5dc206b18cdde6c8c1f56f6cb58951c8515f Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.833153 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" podUID="b33b5271-bda3-41ca-81a3-d47fff657c27" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.836263 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.836344 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.837155 4885 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njr92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.843644 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njr92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.845024 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.849332 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.852654 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cfac2d6_6888_4b2d_982e_826f583396e8.slice/crio-1ec3521f50a626c48f9b835a5ec3251059ddfd747295fa9e337ac8388359dc47 WatchSource:0}: Error finding container 1ec3521f50a626c48f9b835a5ec3251059ddfd747295fa9e337ac8388359dc47: Status 404 returned error can't find the container with id 1ec3521f50a626c48f9b835a5ec3251059ddfd747295fa9e337ac8388359dc47 Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.856349 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 08 19:33:02 
crc kubenswrapper[4885]: while [ true ]; Mar 08 19:33:02 crc kubenswrapper[4885]: do Mar 08 19:33:02 crc kubenswrapper[4885]: for f in $(ls /tmp/serviceca); do Mar 08 19:33:02 crc kubenswrapper[4885]: echo $f Mar 08 19:33:02 crc kubenswrapper[4885]: ca_file_path="/tmp/serviceca/${f}" Mar 08 19:33:02 crc kubenswrapper[4885]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 08 19:33:02 crc kubenswrapper[4885]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 08 19:33:02 crc kubenswrapper[4885]: if [ -e "${reg_dir_path}" ]; then Mar 08 19:33:02 crc kubenswrapper[4885]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 08 19:33:02 crc kubenswrapper[4885]: else Mar 08 19:33:02 crc kubenswrapper[4885]: mkdir $reg_dir_path Mar 08 19:33:02 crc kubenswrapper[4885]: cp $ca_file_path $reg_dir_path/ca.crt Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: for d in $(ls /etc/docker/certs.d); do Mar 08 19:33:02 crc kubenswrapper[4885]: echo $d Mar 08 19:33:02 crc kubenswrapper[4885]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 08 19:33:02 crc kubenswrapper[4885]: reg_conf_path="/tmp/serviceca/${dp}" Mar 08 19:33:02 crc kubenswrapper[4885]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 08 19:33:02 crc kubenswrapper[4885]: rm -rf /etc/docker/certs.d/$d Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait ${!} Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r95ct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-57qch_openshift-image-registry(0cfac2d6-6888-4b2d-982e-826f583396e8): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.858292 4885 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-57qch" podUID="0cfac2d6-6888-4b2d-982e-826f583396e8" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.860353 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.861439 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.883332 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2zkpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-25vxd_openshift-multus(ac600107-0c97-4ec8-89f6-598b40c166ee): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.890074 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-25vxd" podUID="ac600107-0c97-4ec8-89f6-598b40c166ee" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.909267 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.926127 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.926178 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.926190 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.926208 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.926221 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.934650 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.947433 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.959135 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.963240 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.963281 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.963306 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.963343 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963393 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963455 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963468 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963488 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.963461879 +0000 UTC m=+85.359515942 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963489 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963607 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963614 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.963585682 +0000 UTC m=+85.359639705 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963488 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963757 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963782 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963672 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.963649814 +0000 UTC m=+85.359703847 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963859 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.963831979 +0000 UTC m=+85.359886002 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.973096 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.991902 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.003352 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.018677 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.029004 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.029047 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.029057 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.029073 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.029086 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.032716 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.047258 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.058664 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.064116 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.064351 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.064547 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.064611 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:04.064592549 +0000 UTC m=+85.460646572 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.064674 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:04.064667831 +0000 UTC m=+85.460721854 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.067646 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.076875 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.086818 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.109453 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.122348 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.132086 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.132124 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.132136 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.132162 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.132174 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.139294 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.150185 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.164175 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.180126 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.235415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.235506 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.235531 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.235570 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.235590 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.338785 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.338842 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.338940 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.338997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.339032 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.375363 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.375961 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.377496 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.378273 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.379426 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.379992 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.380656 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.381969 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.382732 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.383848 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.384553 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.385830 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.386359 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.387024 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.388128 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.388715 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.389738 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.390188 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.390759 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.392313 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.392876 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.393888 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.394390 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.395489 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.396023 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.396695 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.398554 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.399627 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.400878 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.401891 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.402907 4885 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.403156 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.406002 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.407762 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.409172 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.413704 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.416282 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.418575 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.420118 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.422623 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.424175 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.426863 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.428652 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.430728 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.432969 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.437885 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.440184 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.443323 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.444616 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.444704 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.444765 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.444783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.444809 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.444828 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.446516 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.447546 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.449596 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.450836 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.451857 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.547410 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.547462 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.547480 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.547504 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc 
kubenswrapper[4885]: I0308 19:33:03.547524 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.650956 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.651029 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.651048 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.651079 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.651100 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.753981 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.754032 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.754050 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.754074 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.754092 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.755347 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"54661ff92d86d446f0561f70be37d97fffc952cd7edc4f3f4e212f70264f4183"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.757168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"3a743dc1993b7ae50186d5ba219f5dc206b18cdde6c8c1f56f6cb58951c8515f"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.758855 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:03 crc kubenswrapper[4885]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 08 19:33:03 crc kubenswrapper[4885]: apiVersion: v1 Mar 08 19:33:03 crc kubenswrapper[4885]: clusters: Mar 08 19:33:03 crc kubenswrapper[4885]: - cluster: Mar 08 19:33:03 crc kubenswrapper[4885]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 08 19:33:03 crc kubenswrapper[4885]: server: https://api-int.crc.testing:6443 Mar 08 19:33:03 crc kubenswrapper[4885]: name: default-cluster Mar 08 19:33:03 crc kubenswrapper[4885]: contexts: Mar 08 19:33:03 crc kubenswrapper[4885]: - context: Mar 08 19:33:03 crc kubenswrapper[4885]: cluster: default-cluster Mar 08 19:33:03 crc kubenswrapper[4885]: namespace: default Mar 08 19:33:03 crc kubenswrapper[4885]: user: default-auth Mar 08 19:33:03 crc kubenswrapper[4885]: name: default-context Mar 08 19:33:03 crc kubenswrapper[4885]: current-context: default-context Mar 08 19:33:03 crc kubenswrapper[4885]: kind: Config Mar 08 19:33:03 
crc kubenswrapper[4885]: preferences: {} Mar 08 19:33:03 crc kubenswrapper[4885]: users: Mar 08 19:33:03 crc kubenswrapper[4885]: - name: default-auth Mar 08 19:33:03 crc kubenswrapper[4885]: user: Mar 08 19:33:03 crc kubenswrapper[4885]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 19:33:03 crc kubenswrapper[4885]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 19:33:03 crc kubenswrapper[4885]: EOF Mar 08 19:33:03 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mlvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:03 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.760074 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" 
podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.760081 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njr92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.760516 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"91535920cc74a837cdd04ae33741adccd66ba5dbbe0b8079fc1da551b4b2ffc8"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.762650 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njr92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.762887 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerStarted","Data":"c7b581b664bb832fbfee0daba19f0962b65d3323b906586299c1c0a80cddcb36"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.763053 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.764020 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 
19:33:03.764219 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.765091 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2zkpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,Volu
meDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-25vxd_openshift-multus(ac600107-0c97-4ec8-89f6-598b40c166ee): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.765429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerStarted","Data":"b756091d164726bb7b3357e4fa113e037011faab93456cc52a0ef3704483935e"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.766168 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-25vxd" podUID="ac600107-0c97-4ec8-89f6-598b40c166ee" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.767315 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:03 crc kubenswrapper[4885]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 08 19:33:03 crc kubenswrapper[4885]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 08 19:33:03 crc kubenswrapper[4885]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7pllt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-ff7b4_openshift-multus(9ac72c25-d3e6-4dda-8444-6cd4442af7e4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:03 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.767322 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-57qch" event={"ID":"0cfac2d6-6888-4b2d-982e-826f583396e8","Type":"ContainerStarted","Data":"1ec3521f50a626c48f9b835a5ec3251059ddfd747295fa9e337ac8388359dc47"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.768416 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-ff7b4" podUID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.769121 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" event={"ID":"b33b5271-bda3-41ca-81a3-d47fff657c27","Type":"ContainerStarted","Data":"f3227465f55346847f5550884a45b9b226294df4f4196094a95b4362ce78201b"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.769175 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:03 crc kubenswrapper[4885]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 08 19:33:03 crc kubenswrapper[4885]: while [ true ]; Mar 08 19:33:03 crc kubenswrapper[4885]: do Mar 08 19:33:03 crc 
kubenswrapper[4885]: for f in $(ls /tmp/serviceca); do Mar 08 19:33:03 crc kubenswrapper[4885]: echo $f Mar 08 19:33:03 crc kubenswrapper[4885]: ca_file_path="/tmp/serviceca/${f}" Mar 08 19:33:03 crc kubenswrapper[4885]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 08 19:33:03 crc kubenswrapper[4885]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 08 19:33:03 crc kubenswrapper[4885]: if [ -e "${reg_dir_path}" ]; then Mar 08 19:33:03 crc kubenswrapper[4885]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 08 19:33:03 crc kubenswrapper[4885]: else Mar 08 19:33:03 crc kubenswrapper[4885]: mkdir $reg_dir_path Mar 08 19:33:03 crc kubenswrapper[4885]: cp $ca_file_path $reg_dir_path/ca.crt Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: done Mar 08 19:33:03 crc kubenswrapper[4885]: for d in $(ls /etc/docker/certs.d); do Mar 08 19:33:03 crc kubenswrapper[4885]: echo $d Mar 08 19:33:03 crc kubenswrapper[4885]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 08 19:33:03 crc kubenswrapper[4885]: reg_conf_path="/tmp/serviceca/${dp}" Mar 08 19:33:03 crc kubenswrapper[4885]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 08 19:33:03 crc kubenswrapper[4885]: rm -rf /etc/docker/certs.d/$d Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: done Mar 08 19:33:03 crc kubenswrapper[4885]: sleep 60 & wait ${!} Mar 08 19:33:03 crc kubenswrapper[4885]: done Mar 08 19:33:03 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r95ct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-57qch_openshift-image-registry(0cfac2d6-6888-4b2d-982e-826f583396e8): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:03 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.770564 4885 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-57qch" podUID="0cfac2d6-6888-4b2d-982e-826f583396e8" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.770762 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:03 crc kubenswrapper[4885]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:03 crc kubenswrapper[4885]: set -euo pipefail Mar 08 19:33:03 crc kubenswrapper[4885]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 08 19:33:03 crc kubenswrapper[4885]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 08 19:33:03 crc kubenswrapper[4885]: # As the secret mount is optional we must wait for the files to be present. Mar 08 19:33:03 crc kubenswrapper[4885]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 08 19:33:03 crc kubenswrapper[4885]: TS=$(date +%s) Mar 08 19:33:03 crc kubenswrapper[4885]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 08 19:33:03 crc kubenswrapper[4885]: HAS_LOGGED_INFO=0 Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: log_missing_certs(){ Mar 08 19:33:03 crc kubenswrapper[4885]: CUR_TS=$(date +%s) Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 08 19:33:03 crc kubenswrapper[4885]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 08 19:33:03 crc kubenswrapper[4885]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. 
Mar 08 19:33:03 crc kubenswrapper[4885]: HAS_LOGGED_INFO=1 Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: } Mar 08 19:33:03 crc kubenswrapper[4885]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do Mar 08 19:33:03 crc kubenswrapper[4885]: log_missing_certs Mar 08 19:33:03 crc kubenswrapper[4885]: sleep 5 Mar 08 19:33:03 crc kubenswrapper[4885]: done Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 08 19:33:03 crc kubenswrapper[4885]: exec /usr/bin/kube-rbac-proxy \ Mar 08 19:33:03 crc kubenswrapper[4885]: --logtostderr \ Mar 08 19:33:03 crc kubenswrapper[4885]: --secure-listen-address=:9108 \ Mar 08 19:33:03 crc kubenswrapper[4885]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 08 19:33:03 crc kubenswrapper[4885]: --upstream=http://127.0.0.1:29108/ \ Mar 08 19:33:03 crc kubenswrapper[4885]: --tls-private-key-file=${TLS_PK} \ Mar 08 19:33:03 crc kubenswrapper[4885]: --tls-cert-file=${TLS_CERT} Mar 08 19:33:03 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bpng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-t2brt_openshift-ovn-kubernetes(b33b5271-bda3-41ca-81a3-d47fff657c27): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:03 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.773813 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:03 crc kubenswrapper[4885]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:03 crc kubenswrapper[4885]: source "/env/_master" Mar 08 19:33:03 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v4_join_subnet_opt= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 08 
19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v6_join_subnet_opt= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v4_transit_switch_subnet_opt= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v6_transit_switch_subnet_opt= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: dns_name_resolver_enabled_flag= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "false" == "true" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: persistent_ips_enabled_flag= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "true" == "true" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: # This is needed so that converting clusters from GA to TP Mar 08 19:33:03 crc kubenswrapper[4885]: # will rollout control plane pods as well Mar 08 19:33:03 crc kubenswrapper[4885]: network_segmentation_enabled_flag= Mar 08 19:33:03 crc kubenswrapper[4885]: multi_network_enabled_flag= Mar 08 19:33:03 crc 
kubenswrapper[4885]: if [[ "true" == "true" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: multi_network_enabled_flag="--enable-multi-network" Mar 08 19:33:03 crc kubenswrapper[4885]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 08 19:33:03 crc kubenswrapper[4885]: exec /usr/bin/ovnkube \ Mar 08 19:33:03 crc kubenswrapper[4885]: --enable-interconnect \ Mar 08 19:33:03 crc kubenswrapper[4885]: --init-cluster-manager "${K8S_NODE}" \ Mar 08 19:33:03 crc kubenswrapper[4885]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 08 19:33:03 crc kubenswrapper[4885]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 08 19:33:03 crc kubenswrapper[4885]: --metrics-bind-address "127.0.0.1:29108" \ Mar 08 19:33:03 crc kubenswrapper[4885]: --metrics-enable-pprof \ Mar 08 19:33:03 crc kubenswrapper[4885]: --metrics-enable-config-duration \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${ovn_v4_join_subnet_opt} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${ovn_v6_join_subnet_opt} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${dns_name_resolver_enabled_flag} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${persistent_ips_enabled_flag} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${multi_network_enabled_flag} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${network_segmentation_enabled_flag} Mar 08 19:33:03 crc kubenswrapper[4885]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bpng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-t2brt_openshift-ovn-kubernetes(b33b5271-bda3-41ca-81a3-d47fff657c27): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:03 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.775010 4885 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" podUID="b33b5271-bda3-41ca-81a3-d47fff657c27" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.777882 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.793624 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.808795 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.823493 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.837884 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.850542 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.858133 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.858186 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.858246 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.858277 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.858332 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.859254 4885 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.867682 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.881179 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.897817 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.924254 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.940387 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.959802 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.961975 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.962025 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.962042 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.962069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.962089 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.973248 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.975007 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.975166 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.975458 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:03 crc 
kubenswrapper[4885]: E0308 19:33:03.975597 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:05.975560948 +0000 UTC m=+87.371615011 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.975720 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.975763 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.975786 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.976003 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:05.975865386 +0000 UTC m=+87.371919439 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.976161 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.976232 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.976374 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.976428 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:05.976411762 +0000 UTC m=+87.372465815 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.977103 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.977159 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.977192 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.977280 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:05.977254544 +0000 UTC m=+87.373308827 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.989392 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.007082 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.019597 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.030803 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.049296 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.065525 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.065594 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.065623 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.065663 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.065691 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.067579 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.077634 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.077957 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:06.077871331 +0000 UTC m=+87.473925384 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.078168 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.078351 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.078439 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:06.078422966 +0000 UTC m=+87.474477029 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.099776 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.114359 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.137078 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.155916 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.170327 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.170407 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.170426 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.170475 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.170498 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.174292 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.194680 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.205804 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.220411 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.230848 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.274308 4885 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.274369 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.274382 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.274402 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.274416 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.368476 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.368633 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.368646 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.368772 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.368642 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.369297 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.368621 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.370010 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.377779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.377841 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.377861 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.377885 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.377904 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.391438 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.482063 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.482128 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.482145 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.482172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.482191 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.586185 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.586267 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.586285 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.586318 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.586337 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.690460 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.690529 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.690548 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.690575 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.690592 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.794524 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.794637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.794656 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.794681 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.794701 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.899184 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.899268 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.899295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.899328 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.899351 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.002640 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.002734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.002754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.002785 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.002804 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.106745 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.106820 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.106842 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.106869 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.106888 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.210689 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.210768 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.210785 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.210832 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.210852 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.313826 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.313911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.313976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.314013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.314036 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.417194 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.417305 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.417325 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.417348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.417365 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.520576 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.520644 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.520661 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.520688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.520706 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.623851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.623964 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.623991 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.624021 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.624043 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.727639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.727712 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.727736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.727768 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.727791 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.831014 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.831114 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.831138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.831167 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.831188 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.934589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.934651 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.934669 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.934693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.934710 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.002229 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.002342 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.002382 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.002415 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002488 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002530 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002538 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002572 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002573 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002649 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002703 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002724 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:06 crc 
kubenswrapper[4885]: E0308 19:33:06.002608 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.002587194 +0000 UTC m=+91.398641247 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002887 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.002802939 +0000 UTC m=+91.398856992 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002992 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.002912892 +0000 UTC m=+91.398967025 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.003074 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.003020025 +0000 UTC m=+91.399074088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.038150 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.038262 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.038286 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.038309 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.038363 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.103136 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.103435 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.103365415 +0000 UTC m=+91.499419468 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.103543 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.103701 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.103788 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.103767455 +0000 UTC m=+91.499821518 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.141239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.141303 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.141320 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.141345 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.141363 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.244841 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.244909 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.244961 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.244997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.245015 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.349715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.349783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.349802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.349830 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.349855 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.367131 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.367143 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.367207 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.367297 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.367441 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.367646 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.367898 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.368146 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.455039 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.455108 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.455134 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.455177 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.455196 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.558442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.558597 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.558907 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.558994 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.559016 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.663562 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.663631 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.663652 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.663685 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.663707 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.767782 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.767858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.767882 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.767912 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.767977 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.871371 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.871434 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.871450 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.871474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.871491 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.974788 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.974870 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.974890 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.974916 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.974962 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.078722 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.078800 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.078823 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.078858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.078878 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.182096 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.182163 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.182182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.182208 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.182226 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.286067 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.286135 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.286153 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.286176 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.286194 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.384353 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.384357 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 19:33:07 crc kubenswrapper[4885]: E0308 19:33:07.384604 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.388467 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.388513 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.388529 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.388549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.388566 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.499285 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.499349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.499371 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.499396 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.499410 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.604544 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.604596 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.604606 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.604622 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.604632 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.708807 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.708895 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.708915 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.708976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.708998 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.783305 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:33:07 crc kubenswrapper[4885]: E0308 19:33:07.783543 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.812853 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.812903 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.812937 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.812956 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.812967 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.916162 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.916222 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.916234 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.916254 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.916266 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.019588 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.019655 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.019671 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.019697 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.019721 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.123705 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.123825 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.123897 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.123980 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.124016 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.227115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.227290 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.227304 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.227324 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.227338 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.331175 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.331252 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.331271 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.331301 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.331324 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.367403 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.367467 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.367413 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.367631 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:08 crc kubenswrapper[4885]: E0308 19:33:08.367803 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:08 crc kubenswrapper[4885]: E0308 19:33:08.368243 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:08 crc kubenswrapper[4885]: E0308 19:33:08.368499 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:08 crc kubenswrapper[4885]: E0308 19:33:08.368591 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.432171 4885 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.434311 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.434363 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.434380 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.434406 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.434425 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.536458 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.536510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.536521 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.536538 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.536549 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.639091 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.639162 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.639173 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.639193 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.639206 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.742744 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.742823 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.742841 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.742867 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.742887 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.846379 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.846448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.846466 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.846492 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.846509 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.949851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.949952 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.949972 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.949998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.950018 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.052384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.052448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.052470 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.052502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.052522 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.155802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.155861 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.155875 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.155897 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.155911 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.258696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.259078 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.259985 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.260148 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.260258 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.369439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.369499 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.369518 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.369575 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.369593 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.383329 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.404068 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.417827 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.432416 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.444700 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.472723 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.473605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.473699 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.473715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.473736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.473750 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.489083 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.507303 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.520370 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.548514 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.565574 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.576847 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.577541 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.577629 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.577650 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.577688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.577709 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.585837 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.593311 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.605105 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.615418 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.681245 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.681349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.681369 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.681415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.681433 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.784119 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.784198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.784223 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.784256 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.784276 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.887172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.887232 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.887246 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.887269 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.887284 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.991292 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.992081 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.992128 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.992156 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.992174 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.022863 4885 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.054821 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.054969 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.055030 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.055100 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 
19:33:10.055180 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055234 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055317 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055364 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055392 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055335 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.055295218 +0000 UTC m=+99.451349281 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055528 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.055495743 +0000 UTC m=+99.451549976 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055565 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.055553864 +0000 UTC m=+99.451608157 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055564 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055615 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055641 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055751 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.055723809 +0000 UTC m=+99.451777862 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.095474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.095596 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.095621 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.095653 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.095672 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.156373 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.156633 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.156588473 +0000 UTC m=+99.552642546 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.156976 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.157178 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.157274 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.157252581 +0000 UTC m=+99.553306694 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.198134 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.198220 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.198238 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.198268 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.198291 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.301015 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.301118 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.301143 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.301174 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.301196 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.368020 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.368091 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.368056 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.368059 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.368288 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.368393 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.368491 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.368575 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.409344 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.409418 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.409434 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.409457 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.409477 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.513116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.513189 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.513208 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.513236 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.513256 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.616478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.616549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.616576 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.616613 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.616636 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.720011 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.720069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.720085 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.720109 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.720126 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.823295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.823379 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.823398 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.823430 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.823452 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.926465 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.926566 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.926588 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.926620 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.926643 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.928902 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.928985 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.929004 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.929036 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.929058 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.942948 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.948080 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.948135 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.948153 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.948182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.948205 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.963316 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.968279 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.968358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.968378 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.968409 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.968430 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.980074 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.984792 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.984849 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.984872 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.984905 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.984961 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: E0308 19:33:11.001815 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.007150 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.007202 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.007220 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.007243 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.007261 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: E0308 19:33:11.023436 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:11 crc kubenswrapper[4885]: E0308 19:33:11.023696 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.030220 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.030276 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.030304 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.030331 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.030353 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.133442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.133479 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.133499 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.133521 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.133550 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.236036 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.236090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.236108 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.236131 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.236149 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.339652 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.339711 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.339728 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.339754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.339772 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.443753 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.443811 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.443828 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.443851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.443869 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.547546 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.547619 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.547671 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.547695 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.547711 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.651386 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.651503 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.651528 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.651564 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.651587 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.755278 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.755352 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.755372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.755402 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.755428 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.858682 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.858743 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.858763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.858805 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.858824 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.962128 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.962171 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.962181 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.962198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.962210 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.064619 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.064696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.064718 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.064748 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.064774 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.168001 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.168082 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.168104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.168139 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.168162 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.273821 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.273903 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.273938 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.273969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.273994 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.367214 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.367272 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.367439 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:12 crc kubenswrapper[4885]: E0308 19:33:12.367443 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.367493 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:12 crc kubenswrapper[4885]: E0308 19:33:12.367653 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:12 crc kubenswrapper[4885]: E0308 19:33:12.367775 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:12 crc kubenswrapper[4885]: E0308 19:33:12.367865 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.379174 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.379230 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.379248 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.379273 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.379294 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.482486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.482557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.482578 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.482608 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.482632 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.586506 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.586598 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.586630 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.586665 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.586686 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.690135 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.690202 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.690216 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.690238 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.690658 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.795028 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.795583 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.795733 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.795888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.796143 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.899498 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.899567 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.899584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.899611 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.899631 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.003008 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.003121 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.003148 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.003190 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.003220 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.107103 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.107164 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.107183 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.107207 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.107225 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.210511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.210582 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.210604 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.210630 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.210647 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.314088 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.314512 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.314696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.314829 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.315007 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.421355 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.421408 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.421425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.421451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.421473 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.525295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.525335 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.525348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.525369 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.525383 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.629377 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.629426 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.629446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.629469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.629484 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.733382 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.733451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.733473 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.733501 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.733527 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.804704 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5lms" event={"ID":"bc890659-71a7-4024-bae6-e1e1ef563f17","Type":"ContainerStarted","Data":"b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.821528 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.832835 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.838151 4885 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.838211 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.838229 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.838258 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.838283 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.861979 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.880051 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.894426 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.910045 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.921245 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.937789 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.941362 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.941411 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.941431 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.941462 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.941482 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.960048 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.988324 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.003863 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.024369 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.034403 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.046616 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.046687 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.046708 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.046738 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.046757 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.057824 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.074313 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.089909 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.150093 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.150178 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.150198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.150230 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.150249 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.254337 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.254420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.254439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.254468 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.254488 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.358401 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.358449 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.358460 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.358483 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.358498 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.367799 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.368177 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.368273 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.368301 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:14 crc kubenswrapper[4885]: E0308 19:33:14.368179 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:14 crc kubenswrapper[4885]: E0308 19:33:14.368536 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:14 crc kubenswrapper[4885]: E0308 19:33:14.368647 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:14 crc kubenswrapper[4885]: E0308 19:33:14.368983 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.462346 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.462412 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.462433 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.462465 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.462487 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.565781 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.565849 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.565867 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.565894 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.565913 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.669801 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.669876 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.669896 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.669999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.670030 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.773651 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.773713 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.773735 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.773763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.773781 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.877209 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.877275 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.877294 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.877321 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.877345 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.980328 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.980755 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.981024 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.981190 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.981357 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.084979 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.085048 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.085063 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.085087 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.085102 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.187966 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.188035 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.188052 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.188081 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.188100 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.290954 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.291018 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.291037 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.291063 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.291085 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.396743 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.396799 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.396817 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.396841 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.396859 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.500000 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.500066 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.500087 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.500115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.500132 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.604326 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.604396 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.604417 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.604451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.604477 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.707994 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.708555 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.708580 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.708609 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.708635 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.816198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.816279 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.816408 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.816450 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.816481 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.822347 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e" exitCode=0 Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.822419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.826781 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.826842 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.839993 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.848723 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.866364 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7
586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.880579 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.894883 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.909218 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.921183 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.921365 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.921478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.921590 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.921696 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.925809 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.935674 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.949638 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.973638 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.985781 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.015062 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.022587 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.024396 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.024439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.024453 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.024473 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.024487 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.030848 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.040327 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.057209 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.073357 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.089274 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.105104 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.127747 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.127833 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.127858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.127891 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.127915 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.131795 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.148238 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.171479 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.199823 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.228183 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.230641 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.230694 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.230714 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.230739 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.230756 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.247162 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.263343 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.283567 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.300689 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc 
kubenswrapper[4885]: I0308 19:33:16.331126 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.334269 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.334315 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.334331 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.334354 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.334369 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.350371 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.367177 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:16 crc kubenswrapper[4885]: E0308 19:33:16.367355 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.367764 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.367778 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:16 crc kubenswrapper[4885]: E0308 19:33:16.367990 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:16 crc kubenswrapper[4885]: E0308 19:33:16.368089 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.368165 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.368419 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:16 crc kubenswrapper[4885]: E0308 19:33:16.368539 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.383179 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.440802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.440844 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.440857 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.440877 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.440891 4885 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.543376 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.543425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.543436 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.543459 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.543477 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.646438 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.646479 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.646490 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.646509 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.646522 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.749305 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.749372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.749391 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.749421 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.749440 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.832712 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.835341 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" event={"ID":"b33b5271-bda3-41ca-81a3-d47fff657c27","Type":"ContainerStarted","Data":"f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.835428 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" event={"ID":"b33b5271-bda3-41ca-81a3-d47fff657c27","Type":"ContainerStarted","Data":"817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841173 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841241 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841262 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" 
event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841285 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.853064 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.853146 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.853165 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.853188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.853207 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.856052 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.877531 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.891726 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.914936 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.932213 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.950063 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.956721 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.956779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.956798 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.956825 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.956847 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.968158 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.990294 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.006154 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.023024 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.056734 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5
a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.059720 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.059786 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.059807 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.059835 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.059854 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.080594 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.101212 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.117976 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.139091 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.157323 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.162820 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.162878 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.162895 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.162948 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.162971 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.180843 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.195818 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.218610 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.236134 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.255589 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.266110 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.266333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.266419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.266523 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.266613 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.277986 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.313210 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.331122 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.360215 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.369249 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.369289 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.369298 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.369313 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.369323 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.406007 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.421440 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.442804 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.459198 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.472069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.472134 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.472151 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.472175 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.472190 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.473392 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.487358 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.497867 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc 
kubenswrapper[4885]: I0308 19:33:17.574911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.574983 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.574995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.575016 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.575030 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.678140 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.678169 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.678177 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.678190 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.678199 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.781334 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.781383 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.781400 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.781429 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.781452 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.847026 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.847095 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.869733 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.884418 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.884478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.884500 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.884530 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.884552 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.889157 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.910498 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.939656 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.965025 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.983359 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.988615 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.988663 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.988683 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.988709 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.988727 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.000778 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.027063 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.047769 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.068828 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091039 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091078 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091087 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091100 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091111 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091626 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.117994 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.133553 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.148740 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.149307 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.149357 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.149419 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.149450 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149538 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149563 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149599 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149620 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 
19:33:18.149626 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149637 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:34.149617942 +0000 UTC m=+115.545671965 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149648 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149664 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149666 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:34.149654683 +0000 UTC m=+115.545708936 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149631 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149726 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:34.149705005 +0000 UTC m=+115.545759038 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149757 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:34.149741206 +0000 UTC m=+115.545795229 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.171877 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.184889 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc 
kubenswrapper[4885]: I0308 19:33:18.193374 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.193419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.193432 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.193450 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.193465 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.250219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.250424 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.250476 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:34.250441795 +0000 UTC m=+115.646495828 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.250605 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.250679 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:34.25065743 +0000 UTC m=+115.646711513 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.295621 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.295651 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.295661 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.295674 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.295683 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.367623 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.367767 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.368133 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.368188 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.368433 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.368584 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.368671 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.368723 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.397901 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.397976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.397997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.398018 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.398033 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.501636 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.501707 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.501732 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.501765 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.501789 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.605688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.605742 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.605759 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.605783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.605800 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.709659 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.709711 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.709727 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.709751 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.709769 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.812751 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.812811 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.812829 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.812857 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.812876 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.852634 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.857584 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerStarted","Data":"578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.876596 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.893059 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.910758 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.916812 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.916847 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.916858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.916880 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.916894 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.930840 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.944367 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc 
kubenswrapper[4885]: I0308 19:33:18.963910 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.978971 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.990574 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.004561 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.019047 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.019080 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.019092 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.019110 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.019122 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.024798 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.054247 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.074149 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.097510 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.116471 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.124584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.124648 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.124662 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.124681 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.124692 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.132384 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.149345 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.164464 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.179003 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.193459 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.213405 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.228839 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.228967 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.228988 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc 
kubenswrapper[4885]: I0308 19:33:19.229012 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.229030 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.229639 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc 
kubenswrapper[4885]: I0308 19:33:19.295638 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.318749 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.329374 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.330837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.330866 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.330876 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.330890 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.330899 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.341501 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.349808 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.363819 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.381339 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.406300 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.421275 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.435555 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.435594 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.435606 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.435624 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.435636 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.442442 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.455187 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.470428 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.479568 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.495056 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.504992 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.517992 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.532570 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.537614 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.537646 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.537655 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.537670 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.537679 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.557041 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.567903 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.577475 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.589735 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.602161 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.621848 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.635639 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.639349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.639474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.639561 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.639642 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.639722 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.651205 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.663157 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.673726 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc 
kubenswrapper[4885]: I0308 19:33:19.742887 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.742969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.742986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.743010 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.743027 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.847027 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.847103 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.847127 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.847158 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.847182 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.863103 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-57qch" event={"ID":"0cfac2d6-6888-4b2d-982e-826f583396e8","Type":"ContainerStarted","Data":"d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.868720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.873823 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerStarted","Data":"219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.882670 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.898215 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc 
kubenswrapper[4885]: I0308 19:33:19.920192 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.943252 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.950242 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.950285 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.950296 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.950310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.950321 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.963305 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.976611 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.988228 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.001689 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.013257 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.031278 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.043818 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.052656 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.052690 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.052700 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.052717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.052729 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.061192 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.075125 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.093607 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.112171 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.130620 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.149287 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.154801 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.154834 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.154842 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.154855 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.154863 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.164586 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7
ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.179983 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.195147 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.206570 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc 
kubenswrapper[4885]: I0308 19:33:20.230209 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.249948 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.257676 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.257712 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.257720 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.257736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.257747 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.265340 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.281892 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.295410 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.315219 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.332979 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.360016 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.360049 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.360058 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.360071 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.360081 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.363298 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.367773 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:20 crc kubenswrapper[4885]: E0308 19:33:20.367863 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.367984 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:20 crc kubenswrapper[4885]: E0308 19:33:20.368027 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.368114 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.368182 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:20 crc kubenswrapper[4885]: E0308 19:33:20.368252 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:20 crc kubenswrapper[4885]: E0308 19:33:20.368414 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.381182 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control
-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.402443 4885 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.416835 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.462740 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.462810 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.462828 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.462855 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.462875 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.566420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.566477 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.566494 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.566518 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.566537 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.669625 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.669696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.669717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.669742 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.669762 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.773768 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.773837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.773853 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.773882 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.773901 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.880367 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152" exitCode=0 Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.880473 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.876794 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.883604 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.885449 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.885595 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.885631 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.907988 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z 
is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.925202 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.941431 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.998799 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.998833 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.998845 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.998862 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.998874 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.999186 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.015332 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.031623 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.048762 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc 
kubenswrapper[4885]: I0308 19:33:21.082155 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107665 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107793 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107842 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107861 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107908 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.122484 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.138907 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895
eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.155444 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.171381 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.196747 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.212949 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.212997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.213013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.213037 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.213053 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.219137 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.245406 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.316665 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.316737 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.316757 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.316790 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.316808 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.326136 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.326178 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.326188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.326208 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.326221 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.342013 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.349910 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.349964 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.349977 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.349997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.350011 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.366103 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.368879 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.369247 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.373201 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.373258 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.373276 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.373303 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.373326 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.394046 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.400437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.400495 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.400514 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.400542 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.400561 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.421338 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.427187 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.427244 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.427261 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.427288 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.427307 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.454052 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.454274 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.456188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.456228 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.456244 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.456267 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.456285 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.559886 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.559996 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.560015 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.560041 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.560059 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.664144 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.664234 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.664256 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.664291 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.664315 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.767949 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.767986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.767997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.768013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.768026 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.870483 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.870538 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.870558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.870585 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.870604 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.887050 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerStarted","Data":"e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.895079 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.895673 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.895764 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.896013 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.918277 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.938464 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:21 crc 
kubenswrapper[4885]: I0308 19:33:21.938568 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.942182 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.963205 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.974087 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.974172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.974194 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.974391 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.974409 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.994704 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.014740 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.029711 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.045885 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.060994 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc 
kubenswrapper[4885]: I0308 19:33:22.078375 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.078446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.078469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.078500 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.078522 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.081320 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.093580 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.123269 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.140097 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.161706 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.172021 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.181854 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.181902 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.181945 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.181969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.181983 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.185613 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.203903 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.226003 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.243714 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.261584 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.284806 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.284837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.284845 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.284877 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.284888 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.291901 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.316239 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.332231 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.357606 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.368129 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.368222 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.368223 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.368227 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:22 crc kubenswrapper[4885]: E0308 19:33:22.368444 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:22 crc kubenswrapper[4885]: E0308 19:33:22.368475 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:22 crc kubenswrapper[4885]: E0308 19:33:22.368659 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:22 crc kubenswrapper[4885]: E0308 19:33:22.368781 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.375731 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc 
kubenswrapper[4885]: I0308 19:33:22.387455 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.387512 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.387529 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.387556 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.387578 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.399990 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.413508 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.430884 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.452392 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.467580 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.484721 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.491252 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.491340 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.491359 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.491384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.491401 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.504711 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.536227 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.594706 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.594783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.594804 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.594835 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.594857 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.697484 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.697543 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.697562 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.697589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.697610 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.801431 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.801498 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.801525 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.801557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.801578 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.904360 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.904409 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.904428 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.904455 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.904475 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.905231 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca" exitCode=0 Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.905300 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.925010 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.938609 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.969647 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.004170 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.007345 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.007384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc 
kubenswrapper[4885]: I0308 19:33:23.007399 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.007416 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.007429 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.027248 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.060690 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.075707 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.099263 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.112395 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.112449 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.112461 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.112483 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.112495 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.115575 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.134578 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.149283 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.169402 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.184078 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.199602 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.212661 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.214784 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.214828 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.214839 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc 
kubenswrapper[4885]: I0308 19:33:23.214857 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.214870 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.222543 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc 
kubenswrapper[4885]: I0308 19:33:23.323904 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.323987 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.323999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.324019 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.324033 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.426491 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.426536 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.426547 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.426565 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.426577 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.529255 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.529314 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.529345 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.529374 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.529393 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.632289 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.632358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.632377 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.632404 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.632423 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.740515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.740973 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.740995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.741025 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.741044 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.844263 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.844331 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.844343 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.844361 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.844399 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.914860 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d" exitCode=0 Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.915680 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.938668 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.947736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.947774 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.947786 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.947810 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.947824 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.955349 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.978508 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.994496 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f64
45c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.014042 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.028548 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.051535 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.051586 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.051603 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.051628 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.051648 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.063355 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.085743 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.103476 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.135328 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.154258 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc 
kubenswrapper[4885]: I0308 19:33:24.154302 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.154312 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.154330 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.154341 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.167992 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.202568 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.228897 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.241702 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.256814 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.256873 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.256886 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.256910 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.256941 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.257666 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\
\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.270363 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc 
kubenswrapper[4885]: I0308 19:33:24.360540 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.360583 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.360592 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.360609 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.360619 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.367963 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.367994 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.368004 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:24 crc kubenswrapper[4885]: E0308 19:33:24.368122 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.368141 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:24 crc kubenswrapper[4885]: E0308 19:33:24.368233 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:24 crc kubenswrapper[4885]: E0308 19:33:24.368453 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:24 crc kubenswrapper[4885]: E0308 19:33:24.368673 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.472894 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.472983 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.473005 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.473031 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.473050 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.576608 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.576713 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.576733 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.576763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.576783 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.679999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.680072 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.680093 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.680142 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.680161 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.783767 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.783839 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.783856 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.783883 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.783905 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.886692 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.886768 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.886786 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.886812 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.886828 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.922883 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb" exitCode=0 Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.922990 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.926306 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/0.log" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.935764 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176" exitCode=1 Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.935857 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.937067 4885 scope.go:117] "RemoveContainer" containerID="e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.954650 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.978348 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.990676 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc 
kubenswrapper[4885]: I0308 19:33:24.990755 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.990782 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.990813 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.990838 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.996998 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.030719 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.050198 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.066509 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.081865 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.094022 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.094058 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.094068 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc 
kubenswrapper[4885]: I0308 19:33:25.094103 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.094115 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.096035 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc 
kubenswrapper[4885]: I0308 19:33:25.111174 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.125449 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.147053 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.161877 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.179106 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.194177 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.197220 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.197289 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.197307 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.197335 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.197354 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.218795 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.231552 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.245718 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.264523 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"message\\\":\\\"or removal\\\\nI0308 19:33:24.210662 6582 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 19:33:24.210667 6582 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 19:33:24.210707 6582 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 19:33:24.210716 6582 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 19:33:24.210725 6582 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 19:33:24.210732 6582 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 19:33:24.210740 6582 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 19:33:24.215990 6582 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 19:33:24.216015 6582 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 19:33:24.216042 6582 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 19:33:24.216043 6582 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 19:33:24.216057 6582 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 19:33:24.216095 6582 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 19:33:24.216114 6582 factory.go:656] Stopping watch factory\\\\nI0308 19:33:24.216127 6582 ovnkube.go:599] Stopped ovnkube\\\\nI0308 19:33:24.216131 6582 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 
19:33:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c
73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.277032 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.293859 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.302259 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.302303 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.302315 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.302333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.302344 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.307379 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.320199 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.336459 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.353575 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.372418 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.387187 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.405361 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.405420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.405434 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.405457 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.405475 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.415141 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.435231 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.455623 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.472600 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.488980 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.507638 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.508411 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.508457 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.508495 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.508519 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.508534 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.620489 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.620543 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.620557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.620580 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.620599 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.724186 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.724266 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.724278 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.724302 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.724319 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.826705 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.826760 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.826773 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.826798 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.826812 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.929845 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.929913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.929954 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.929981 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.929999 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.947175 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0" exitCode=0 Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.947258 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.950677 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/0.log" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.953996 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.954410 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.964561 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.980886 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5
fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.996899 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.015613 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.032897 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.032978 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.032997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.033024 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.033043 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.045651 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"message\\\":\\\"or removal\\\\nI0308 19:33:24.210662 6582 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 19:33:24.210667 6582 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 19:33:24.210707 6582 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 19:33:24.210716 6582 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 19:33:24.210725 6582 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 19:33:24.210732 6582 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 19:33:24.210740 6582 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 19:33:24.215990 6582 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 19:33:24.216015 6582 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 19:33:24.216042 6582 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 19:33:24.216043 6582 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 19:33:24.216057 6582 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 19:33:24.216095 6582 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 19:33:24.216114 6582 factory.go:656] Stopping watch factory\\\\nI0308 19:33:24.216127 6582 ovnkube.go:599] Stopped ovnkube\\\\nI0308 19:33:24.216131 6582 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 
19:33:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c
73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.068593 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.085137 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.098179 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.112240 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.130798 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.135703 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.135746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.135755 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.135772 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.135781 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.145476 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.158499 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.174998 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.188456 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc 
kubenswrapper[4885]: I0308 19:33:26.206462 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.219220 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.234755 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.239069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.239138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.239159 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.239187 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.239209 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.250883 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.263311 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895
eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.278241 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.300102 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.331267 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"message\\\":\\\"or removal\\\\nI0308 19:33:24.210662 6582 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 19:33:24.210667 6582 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 19:33:24.210707 6582 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 19:33:24.210716 6582 handler.go:208] Removed *v1.Namespace event 
handler 1\\\\nI0308 19:33:24.210725 6582 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 19:33:24.210732 6582 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 19:33:24.210740 6582 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 19:33:24.215990 6582 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 19:33:24.216015 6582 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 19:33:24.216042 6582 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 19:33:24.216043 6582 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 19:33:24.216057 6582 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 19:33:24.216095 6582 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 19:33:24.216114 6582 factory.go:656] Stopping watch factory\\\\nI0308 19:33:24.216127 6582 ovnkube.go:599] Stopped ovnkube\\\\nI0308 19:33:24.216131 6582 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 
19:33:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.343353 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.343419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.343437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.343468 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.343487 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.344206 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.361070 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.367416 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:26 crc kubenswrapper[4885]: E0308 19:33:26.367575 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.367957 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.367992 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:26 crc kubenswrapper[4885]: E0308 19:33:26.368041 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.368169 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:26 crc kubenswrapper[4885]: E0308 19:33:26.368310 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:26 crc kubenswrapper[4885]: E0308 19:33:26.368579 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.375529 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.389586 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.400117 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.414717 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.425233 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.439897 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.445963 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 
19:33:26.446028 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.446050 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.446081 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.446101 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.450212 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.483318 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5
a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.549621 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.549692 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.549710 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.549737 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.549756 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.653793 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.653862 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.653883 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.653908 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.653965 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.756653 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.756727 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.756746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.756772 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.756792 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.860845 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.861017 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.861052 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.861090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.861118 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.961524 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/1.log" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963077 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/0.log" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963777 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963795 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963821 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963838 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.967306 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9" exitCode=1 Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.967364 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.967433 4885 scope.go:117] "RemoveContainer" containerID="e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.968957 4885 scope.go:117] "RemoveContainer" containerID="49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9" Mar 08 19:33:26 crc kubenswrapper[4885]: E0308 19:33:26.969369 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.983119 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5" exitCode=0 Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.983201 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5"} 
Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.019191 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.048718 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.067491 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.067580 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.067597 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.067663 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.067683 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.068650 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.086002 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.102980 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc 
kubenswrapper[4885]: I0308 19:33:27.121357 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.134804 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.162344 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"message\\\":\\\"or removal\\\\nI0308 19:33:24.210662 6582 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 19:33:24.210667 6582 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 19:33:24.210707 6582 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 19:33:24.210716 6582 handler.go:208] Removed *v1.Namespace event 
handler 1\\\\nI0308 19:33:24.210725 6582 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 19:33:24.210732 6582 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 19:33:24.210740 6582 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 19:33:24.215990 6582 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 19:33:24.216015 6582 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 19:33:24.216042 6582 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 19:33:24.216043 6582 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 19:33:24.216057 6582 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 19:33:24.216095 6582 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 19:33:24.216114 6582 factory.go:656] Stopping watch factory\\\\nI0308 19:33:24.216127 6582 ovnkube.go:599] Stopped ovnkube\\\\nI0308 19:33:24.216131 6582 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 19:33:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for 
Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\
\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.170735 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.170818 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.170868 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.170894 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.170909 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.178773 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.202404 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.218405 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5
fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.236393 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.253254 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.273330 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.274407 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 
19:33:27.274457 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.274471 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.274491 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.274507 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.295851 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.316554 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.336582 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.350657 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.379710 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.379770 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.379788 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.379813 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.379831 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.384046 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"message\\\":\\\"or removal\\\\nI0308 19:33:24.210662 6582 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 19:33:24.210667 6582 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 19:33:24.210707 6582 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 19:33:24.210716 6582 handler.go:208] Removed *v1.Namespace event 
handler 1\\\\nI0308 19:33:24.210725 6582 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 19:33:24.210732 6582 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 19:33:24.210740 6582 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 19:33:24.215990 6582 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 19:33:24.216015 6582 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 19:33:24.216042 6582 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 19:33:24.216043 6582 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 19:33:24.216057 6582 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 19:33:24.216095 6582 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 19:33:24.216114 6582 factory.go:656] Stopping watch factory\\\\nI0308 19:33:24.216127 6582 ovnkube.go:599] Stopped ovnkube\\\\nI0308 19:33:24.216131 6582 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 19:33:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for 
Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\
\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.403391 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.426100 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.441291 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.462562 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.483321 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.483888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.483956 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.483976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.484003 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.484022 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.504568 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.523949 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.543967 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.576627 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.586870 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.587074 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.587221 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.587438 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.587569 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.603684 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.625950 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.647518 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.665804 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc 
kubenswrapper[4885]: I0308 19:33:27.691151 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.691224 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.691251 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.691284 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.691308 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.795223 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.795277 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.795296 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.795321 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.795341 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.898813 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.898869 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.898886 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.898909 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.898951 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.995645 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/1.log" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.000860 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.000944 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.001009 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.001107 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.001141 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.004079 4885 scope.go:117] "RemoveContainer" containerID="49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9" Mar 08 19:33:28 crc kubenswrapper[4885]: E0308 19:33:28.004425 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.009238 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerStarted","Data":"964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.025667 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.048109 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.066560 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.098344 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.104565 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.104638 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.104693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.104729 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.104751 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.120239 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.135561 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.153686 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.167989 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc 
kubenswrapper[4885]: I0308 19:33:28.188792 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.202223 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"run
ning\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.207429 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.207458 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.207468 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.207486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.207497 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.221847 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.242130 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.258986 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.281178 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.305355 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.311066 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.311140 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.311161 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.311189 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.311213 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.337150 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.356658 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.367682 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.367733 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.367733 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:28 crc kubenswrapper[4885]: E0308 19:33:28.367950 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.367994 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:28 crc kubenswrapper[4885]: E0308 19:33:28.368108 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:28 crc kubenswrapper[4885]: E0308 19:33:28.368195 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:28 crc kubenswrapper[4885]: E0308 19:33:28.368236 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.376787 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126
.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.395350 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.419793 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.419851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.419867 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc 
kubenswrapper[4885]: I0308 19:33:28.419892 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.419911 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.437096 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.460312 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.479487 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.496213 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.513155 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc 
kubenswrapper[4885]: I0308 19:33:28.523250 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.523299 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.523310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.523331 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.523343 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.534356 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.549521 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.572599 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.592387 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.608528 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.626563 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.626631 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.626649 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.626676 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.626693 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.629483 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.647028 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.678742 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.730379 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.730456 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.730481 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.730517 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.730547 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.834620 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.834689 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.834712 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.834744 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.834767 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.938502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.938558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.938576 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.938601 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.938619 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.041989 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.042049 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.042067 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.042090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.042109 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.145358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.145442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.145469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.145502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.145524 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.248720 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.248788 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.248811 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.248837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.248857 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.352423 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.352480 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.352501 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.352524 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.352543 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.394161 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.411477 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.432673 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.456164 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.456412 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.456552 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.456696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.456840 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.464786 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.483118 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.507725 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.523590 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.542832 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.562064 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.562111 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.562130 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 
19:33:29.562154 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.562174 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.562109 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.584266 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.607139 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.647773 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.665093 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.665213 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.665239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.665271 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.665293 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.683974 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.706182 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.720810 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.735985 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.768699 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.768763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.768779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc 
kubenswrapper[4885]: I0308 19:33:29.768800 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.768814 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.872559 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.872638 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.872658 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.872688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.872711 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.976281 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.976352 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.976371 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.976400 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.976419 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.079812 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.079881 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.079903 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.079960 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.079986 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.183619 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.183727 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.183753 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.183789 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.183816 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.287902 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.288017 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.288046 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.288082 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.288107 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.368078 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.368135 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.368169 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.368135 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:33:30 crc kubenswrapper[4885]: E0308 19:33:30.368272 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:33:30 crc kubenswrapper[4885]: E0308 19:33:30.368465 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:33:30 crc kubenswrapper[4885]: E0308 19:33:30.368572 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:33:30 crc kubenswrapper[4885]: E0308 19:33:30.368752 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.391198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.391420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.391605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.391779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.391958 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.496055 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.496128 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.496148 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.496176 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.496197 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.598830 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.599282 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.599446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.599641 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.599783 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.703085 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.704079 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.704221 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.704405 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.704536 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.806908 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.807277 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.807469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.807618 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.807740 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.911240 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.911296 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.911309 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.911348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.911361 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.014993 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.015057 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.015078 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.015102 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.015121 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.118137 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.118194 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.118211 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.118235 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.118252 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.221992 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.222316 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.222486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.222626 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.222756 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.325866 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.326006 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.326030 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.326064 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.326087 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.429354 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.429440 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.429463 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.429503 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.429527 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.532726 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.532801 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.532819 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.532848 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.532872 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.635629 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.635772 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.635887 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.636016 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.636520 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.661507 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.661561 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.661579 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.661601 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.661619 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.682649 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:31Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.688293 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.688358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.688380 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.688410 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.688432 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.712098 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:31Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.717490 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.717555 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.717576 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.717605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.717624 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.739186 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:31Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.744851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.744967 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.744998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.745029 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.745058 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.764455 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:31Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.770163 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.770254 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.770280 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.770314 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.770340 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.798722 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:31Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.798898 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.801557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.801637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.801654 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.801682 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.801703 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.904893 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.904963 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.904975 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.904992 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.905008 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.007637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.007688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.007710 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.007732 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.007748 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.111530 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.111722 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.111734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.111754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.111763 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.215384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.215483 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.215515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.215636 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.215660 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.319794 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.320084 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.320134 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.320167 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.320186 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.367996 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.368034 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.368009 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.368061 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:32 crc kubenswrapper[4885]: E0308 19:33:32.368481 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:32 crc kubenswrapper[4885]: E0308 19:33:32.368656 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:32 crc kubenswrapper[4885]: E0308 19:33:32.368915 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:32 crc kubenswrapper[4885]: E0308 19:33:32.369049 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.369867 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.423615 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.423673 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.423691 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.423717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.423735 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.527833 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.527911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.527973 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.528005 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.528025 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.631122 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.631182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.631202 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.631239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.631260 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.737448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.737502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.737523 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.737550 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.737573 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.842962 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.843425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.843443 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.843466 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.843485 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.947560 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.947617 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.947633 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.947660 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.947677 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.031759 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.034116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.035422 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.055238 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.055319 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.055338 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.055364 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.055383 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.063207 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.089810 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572e
eb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:2
5Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.108098 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.126170 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.146504 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.159639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.159699 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.159719 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.159747 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.159766 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.227420 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.249229 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.262159 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.262366 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.262433 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.262503 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.262570 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.265779 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7
ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.277571 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.300755 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.318972 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.332419 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.346329 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.360937 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc 
kubenswrapper[4885]: I0308 19:33:33.364730 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.364824 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.364888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.364968 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.365050 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.381774 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.396043 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.468446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.468515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.468535 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.468563 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.468583 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.572006 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.572087 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.572108 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.572140 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.572159 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.676585 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.676647 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.676676 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.676708 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.676729 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.780006 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.780568 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.780696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.780805 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.780901 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.884911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.885200 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.885306 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.885390 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.885452 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.988309 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.988753 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.988887 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.989060 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.989241 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.092322 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.092397 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.092415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.092442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.092462 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.195324 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.195415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.195437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.195465 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.195485 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.206914 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.207011 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.207048 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.207096 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207249 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207301 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207327 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207345 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207381 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.207347443 +0000 UTC m=+147.603401496 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207388 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207485 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207506 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207418 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.207396434 +0000 UTC m=+147.603450487 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207621 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.207585559 +0000 UTC m=+147.603639612 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207670 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207768 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.207718862 +0000 UTC m=+147.603772925 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.299133 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.299494 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.299656 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.299826 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.299988 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.307707 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.307977 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.307906018 +0000 UTC m=+147.703960071 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.308197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.308444 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.308557 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.308528334 +0000 UTC m=+147.704582427 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.367772 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.368149 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.368099 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.367869 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.368585 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.368774 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.368907 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.369587 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.385203 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.403784 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.403911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.404012 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.404043 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.404067 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.507394 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.507456 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.507474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.507501 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.507518 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.610780 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.611276 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.611414 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.611540 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.611685 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.715393 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.715762 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.715915 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.716116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.716262 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.819889 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.821128 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.821310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.821447 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.821645 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.925557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.925622 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.925639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.925664 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.925683 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.028333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.028393 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.028412 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.028437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.028454 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.131234 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.131486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.131630 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.131807 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.131981 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.235560 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.235623 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.235646 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.235680 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.235707 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.339015 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.339092 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.339113 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.339142 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.339164 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.442563 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.442788 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.442909 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.443085 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.443215 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.546187 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.546309 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.546338 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.546369 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.546389 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.649584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.649671 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.649693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.649715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.649732 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.753461 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.753527 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.753548 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.753575 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.753593 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.857203 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.857265 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.857284 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.857308 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.857325 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.961356 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.961415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.961432 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.961456 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.961473 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.064637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.064705 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.064723 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.064751 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.064769 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.167780 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.167852 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.167873 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.167900 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.167950 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.271685 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.271747 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.271765 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.271793 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.271812 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.367871 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.367990 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:36 crc kubenswrapper[4885]: E0308 19:33:36.368594 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.367980 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.368106 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:36 crc kubenswrapper[4885]: E0308 19:33:36.368879 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:36 crc kubenswrapper[4885]: E0308 19:33:36.369074 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:36 crc kubenswrapper[4885]: E0308 19:33:36.369184 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.376769 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.376837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.376873 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.376905 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.376963 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.480389 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.480472 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.480494 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.480523 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.480541 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.583296 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.583354 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.583380 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.583409 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.583429 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.687255 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.687327 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.687349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.687380 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.687403 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.790162 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.790511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.790668 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.790823 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.790990 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.893436 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.893511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.893535 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.893564 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.893587 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.996313 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.996386 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.996410 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.996437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.996491 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.100372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.100470 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.100488 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.100514 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.100532 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.208404 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.208466 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.208485 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.208510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.208528 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.311985 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.312049 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.312072 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.312102 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.312124 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.414717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.414783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.414800 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.414828 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.414849 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.518715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.518773 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.518790 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.518814 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.518833 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.621372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.621430 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.621456 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.621486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.621510 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.724232 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.724289 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.724308 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.724333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.724350 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.827882 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.827986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.828009 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.828039 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.828092 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.932182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.932353 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.932417 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.932446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.932465 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.035545 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.035660 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.035680 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.035706 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.035726 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.139321 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.139390 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.139408 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.139439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.139458 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.243003 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.243070 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.243088 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.243113 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.243132 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.347244 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.347307 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.347325 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.347353 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.347371 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.367304 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.367368 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.367531 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.367856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:33:38 crc kubenswrapper[4885]: E0308 19:33:38.368061 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:33:38 crc kubenswrapper[4885]: E0308 19:33:38.368245 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:33:38 crc kubenswrapper[4885]: E0308 19:33:38.368331 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:33:38 crc kubenswrapper[4885]: E0308 19:33:38.368397 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.451013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.451556 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.451884 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.452287 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.452543 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.556614 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.556766 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.556789 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.556853 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.556873 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.660754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.661148 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.661273 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.661435 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.661553 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.764693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.764777 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.764801 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.764833 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.764852 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.867677 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.867736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.867754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.867777 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.867794 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.970911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.971021 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.971041 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.971068 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.971088 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.074553 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.074621 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.074639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.074665 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.074687 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:39Z","lastTransitionTime":"2026-03-08T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.177878 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.178003 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.178030 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.178062 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.178085 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:39Z","lastTransitionTime":"2026-03-08T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.281342 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.281414 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.281440 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.281476 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.281499 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:39Z","lastTransitionTime":"2026-03-08T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:33:39 crc kubenswrapper[4885]: E0308 19:33:39.381784 4885 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.390570 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.410341 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.436365 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.455658 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.477306 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z"
Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.495404 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc 
kubenswrapper[4885]: E0308 19:33:39.503570 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.529407 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986
695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.552453 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.569549 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.590320 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.610396 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.626445 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.646701 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.667043 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.698068 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.717756 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.742964 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:40 crc kubenswrapper[4885]: I0308 19:33:40.367818 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:40 crc kubenswrapper[4885]: I0308 19:33:40.367840 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:40 crc kubenswrapper[4885]: I0308 19:33:40.367913 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:40 crc kubenswrapper[4885]: I0308 19:33:40.367976 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:40 crc kubenswrapper[4885]: E0308 19:33:40.368222 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:40 crc kubenswrapper[4885]: E0308 19:33:40.368430 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:40 crc kubenswrapper[4885]: E0308 19:33:40.368584 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:40 crc kubenswrapper[4885]: E0308 19:33:40.368737 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:40 crc kubenswrapper[4885]: I0308 19:33:40.369826 4885 scope.go:117] "RemoveContainer" containerID="49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.066304 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/1.log" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.069806 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d"} Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.070828 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:41 crc 
kubenswrapper[4885]: I0308 19:33:41.083347 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc 
kubenswrapper[4885]: I0308 19:33:41.105567 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.128937 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.156768 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.173851 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.188623 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.199767 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc 
kubenswrapper[4885]: I0308 19:33:41.223630 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.237311 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.248585 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.268179 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.279369 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.292130 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.314053 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.335593 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.350070 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.365416 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.990693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.990780 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.990803 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.990829 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.990848 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:41Z","lastTransitionTime":"2026-03-08T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.012557 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.017833 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.017894 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.017913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.017976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.018002 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:42Z","lastTransitionTime":"2026-03-08T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.039275 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.044574 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.044634 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.044652 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.044677 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.044695 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:42Z","lastTransitionTime":"2026-03-08T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.065557 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.072425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.072469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.072488 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.072509 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.072529 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:42Z","lastTransitionTime":"2026-03-08T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.076992 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/2.log" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.077981 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/1.log" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.082714 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" exitCode=1 Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.082770 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d"} Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.082817 4885 scope.go:117] "RemoveContainer" containerID="49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.084162 4885 scope.go:117] "RemoveContainer" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.084487 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.103826 4885 kubelet_node_status.go:585] "Error updating 
node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.108105 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.110744 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.111001 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.111876 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.112116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.112301 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:42Z","lastTransitionTime":"2026-03-08T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.130463 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.142727 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.142882 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.162350 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb
9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\
":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.183025 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.201123 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.216866 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc 
kubenswrapper[4885]: I0308 19:33:42.246258 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.265852 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.282786 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.300648 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.321370 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.333328 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.351330 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.367500 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.367610 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.367705 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.367704 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.367622 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.367853 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.368021 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.368128 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.371170 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.398423 4885 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch 
openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\
":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-over
rides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 
19:33:42.415765 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.438860 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.092070 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/2.log" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.098711 4885 scope.go:117] "RemoveContainer" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" Mar 08 19:33:43 crc kubenswrapper[4885]: E0308 19:33:43.099028 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.131394 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.153405 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.172680 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.191591 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.207447 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc 
kubenswrapper[4885]: I0308 19:33:43.230087 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.252033 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.272258 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.290761 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.316579 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.332417 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.352196 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.372710 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.411449 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.430025 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.446436 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.467260 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:44 crc kubenswrapper[4885]: I0308 19:33:44.367333 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:44 crc kubenswrapper[4885]: I0308 19:33:44.367401 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:44 crc kubenswrapper[4885]: I0308 19:33:44.367483 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:44 crc kubenswrapper[4885]: E0308 19:33:44.367477 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:44 crc kubenswrapper[4885]: I0308 19:33:44.367577 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:44 crc kubenswrapper[4885]: E0308 19:33:44.367635 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:44 crc kubenswrapper[4885]: E0308 19:33:44.367717 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:44 crc kubenswrapper[4885]: E0308 19:33:44.367794 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:44 crc kubenswrapper[4885]: E0308 19:33:44.505299 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:33:45 crc kubenswrapper[4885]: I0308 19:33:45.959783 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:33:45 crc kubenswrapper[4885]: I0308 19:33:45.982379 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:45Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.002182 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:45Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.023842 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.045086 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.066999 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.086200 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc 
kubenswrapper[4885]: I0308 19:33:46.131374 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.158212 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.179619 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.201605 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.224234 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.241410 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.262871 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.284767 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.315568 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.337721 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.362278 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.367389 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.367443 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.367483 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.367391 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:46 crc kubenswrapper[4885]: E0308 19:33:46.367579 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:46 crc kubenswrapper[4885]: E0308 19:33:46.367729 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:46 crc kubenswrapper[4885]: E0308 19:33:46.367820 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:46 crc kubenswrapper[4885]: E0308 19:33:46.368005 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:48 crc kubenswrapper[4885]: I0308 19:33:48.367415 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:48 crc kubenswrapper[4885]: I0308 19:33:48.367485 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:48 crc kubenswrapper[4885]: E0308 19:33:48.367628 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:48 crc kubenswrapper[4885]: I0308 19:33:48.367723 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:48 crc kubenswrapper[4885]: E0308 19:33:48.367788 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:48 crc kubenswrapper[4885]: E0308 19:33:48.368037 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:48 crc kubenswrapper[4885]: I0308 19:33:48.368155 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:48 crc kubenswrapper[4885]: E0308 19:33:48.368307 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.402159 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.426101 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbf
f21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.444291 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.462837 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.479449 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc 
kubenswrapper[4885]: I0308 19:33:49.498323 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: E0308 19:33:49.506112 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.525562 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.542681 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.573307 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.594553 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.618070 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.633508 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.653653 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.672797 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.693222 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.713156 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.732859 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:50 crc kubenswrapper[4885]: I0308 19:33:50.367125 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:50 crc kubenswrapper[4885]: I0308 19:33:50.367144 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:50 crc kubenswrapper[4885]: I0308 19:33:50.367242 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:50 crc kubenswrapper[4885]: I0308 19:33:50.367284 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:50 crc kubenswrapper[4885]: E0308 19:33:50.367514 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:50 crc kubenswrapper[4885]: E0308 19:33:50.367970 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:50 crc kubenswrapper[4885]: E0308 19:33:50.368233 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:50 crc kubenswrapper[4885]: E0308 19:33:50.368465 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:51 crc kubenswrapper[4885]: I0308 19:33:51.388431 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.269067 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.269142 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.269166 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.269196 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.269245 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:52Z","lastTransitionTime":"2026-03-08T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.290809 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:52Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.296390 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.296486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.296510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.296533 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.296551 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:52Z","lastTransitionTime":"2026-03-08T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.313540 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:52Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.318667 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.318717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.318736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.318757 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.318773 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:52Z","lastTransitionTime":"2026-03-08T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.336731 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:52Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.342512 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.342569 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.342592 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.342623 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.342647 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:52Z","lastTransitionTime":"2026-03-08T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.363584 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:52Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.367053 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.367221 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.367257 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.367282 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.367389 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.367457 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.367516 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.367872 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.369292 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.369334 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.369350 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.369374 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.369392 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:52Z","lastTransitionTime":"2026-03-08T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.383150 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.390568 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:52Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.390787 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:33:54 crc kubenswrapper[4885]: I0308 19:33:54.367849 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:54 crc kubenswrapper[4885]: E0308 19:33:54.368079 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:54 crc kubenswrapper[4885]: I0308 19:33:54.368174 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:54 crc kubenswrapper[4885]: I0308 19:33:54.368264 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:54 crc kubenswrapper[4885]: I0308 19:33:54.368179 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:54 crc kubenswrapper[4885]: E0308 19:33:54.368381 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:54 crc kubenswrapper[4885]: E0308 19:33:54.368500 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:54 crc kubenswrapper[4885]: E0308 19:33:54.368596 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:54 crc kubenswrapper[4885]: E0308 19:33:54.507867 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 08 19:33:56 crc kubenswrapper[4885]: I0308 19:33:56.368014 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:56 crc kubenswrapper[4885]: I0308 19:33:56.368081 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:56 crc kubenswrapper[4885]: I0308 19:33:56.368092 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:56 crc kubenswrapper[4885]: I0308 19:33:56.368115 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:56 crc kubenswrapper[4885]: E0308 19:33:56.368227 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:56 crc kubenswrapper[4885]: E0308 19:33:56.368358 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:56 crc kubenswrapper[4885]: E0308 19:33:56.368579 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:56 crc kubenswrapper[4885]: E0308 19:33:56.368709 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:57 crc kubenswrapper[4885]: I0308 19:33:57.368656 4885 scope.go:117] "RemoveContainer" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" Mar 08 19:33:57 crc kubenswrapper[4885]: E0308 19:33:57.368971 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:58 crc kubenswrapper[4885]: I0308 19:33:58.367603 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:58 crc kubenswrapper[4885]: I0308 19:33:58.367665 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:58 crc kubenswrapper[4885]: I0308 19:33:58.367624 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:58 crc kubenswrapper[4885]: E0308 19:33:58.367798 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:58 crc kubenswrapper[4885]: I0308 19:33:58.367963 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:58 crc kubenswrapper[4885]: E0308 19:33:58.368020 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:58 crc kubenswrapper[4885]: E0308 19:33:58.367961 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:58 crc kubenswrapper[4885]: E0308 19:33:58.368121 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.384673 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.404997 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd
0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.420727 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.440321 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.460562 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.492033 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: E0308 19:33:59.508490 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.511806 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.543914 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.564807 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.585061 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.606064 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.639791 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.664671 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.684531 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.703467 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.720002 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc 
kubenswrapper[4885]: I0308 19:33:59.743040 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.763168 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.780474 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:00 crc kubenswrapper[4885]: I0308 19:34:00.367782 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:00 crc kubenswrapper[4885]: I0308 19:34:00.367866 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:00 crc kubenswrapper[4885]: I0308 19:34:00.367808 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:00 crc kubenswrapper[4885]: I0308 19:34:00.367810 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:00 crc kubenswrapper[4885]: E0308 19:34:00.368054 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:00 crc kubenswrapper[4885]: E0308 19:34:00.368156 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:00 crc kubenswrapper[4885]: E0308 19:34:00.368266 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:00 crc kubenswrapper[4885]: E0308 19:34:00.368345 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.368107 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.368162 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.368165 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.368220 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.368314 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.368539 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.368730 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.368843 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.398462 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.398518 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.398535 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.398556 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.398575 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:02Z","lastTransitionTime":"2026-03-08T19:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.419649 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:02Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.425138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.425198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.425215 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.425239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.425259 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:02Z","lastTransitionTime":"2026-03-08T19:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.445235 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:02Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.450467 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.450517 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.450534 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.450558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.450574 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:02Z","lastTransitionTime":"2026-03-08T19:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.501044 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.501097 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.501115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.501138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.501154 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:02Z","lastTransitionTime":"2026-03-08T19:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.520071 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:02Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.520304 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:34:04 crc kubenswrapper[4885]: I0308 19:34:04.368193 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:04 crc kubenswrapper[4885]: I0308 19:34:04.368424 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:04 crc kubenswrapper[4885]: E0308 19:34:04.368469 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:04 crc kubenswrapper[4885]: E0308 19:34:04.368711 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:04 crc kubenswrapper[4885]: I0308 19:34:04.368838 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:04 crc kubenswrapper[4885]: E0308 19:34:04.369002 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:04 crc kubenswrapper[4885]: I0308 19:34:04.369402 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:04 crc kubenswrapper[4885]: E0308 19:34:04.369539 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:04 crc kubenswrapper[4885]: E0308 19:34:04.510578 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.200292 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/0.log" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.200384 4885 generic.go:334] "Generic (PLEG): container finished" podID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" containerID="578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6" exitCode=1 Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.200433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerDied","Data":"578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6"} Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.201131 4885 scope.go:117] "RemoveContainer" containerID="578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.223127 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.243014 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.244897 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.244981 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.245017 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.245067 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245286 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245325 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245346 
4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245411 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.24539066 +0000 UTC m=+211.641444713 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245685 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245748 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.245734239 +0000 UTC m=+211.641788292 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245820 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245987 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.245952044 +0000 UTC m=+211.642006097 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245990 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.246093 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.246343 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.246458 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.246429998 +0000 UTC m=+211.642484051 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.269843 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.286732 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.307050 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.327899 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.346047 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.346242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.346421 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.346381007 +0000 UTC m=+211.742435060 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.346648 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.346825 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.346784187 +0000 UTC m=+211.742838290 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.362149 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.367213 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.367223 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.367359 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.367553 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.367585 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.367721 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.367969 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.368031 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.381832 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.405541 4885 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:2
1Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.424179 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.443452 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.465835 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.487046 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.514195 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.535556 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.555047 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.567263 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc 
kubenswrapper[4885]: I0308 19:34:06.591254 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.609500 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.208662 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/0.log" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.208752 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerStarted","Data":"f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d"} Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.232486 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbf
f21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.252610 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.270172 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.283312 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc 
kubenswrapper[4885]: I0308 19:34:07.315768 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.338694 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.354234 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.376887 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.395796 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.411708 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02a
da03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.437999 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.458445 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.482617 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.500965 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.523489 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.539723 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.561210 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.581698 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.604085 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:08 crc kubenswrapper[4885]: I0308 19:34:08.367872 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:08 crc kubenswrapper[4885]: I0308 19:34:08.368010 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:08 crc kubenswrapper[4885]: E0308 19:34:08.368061 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:08 crc kubenswrapper[4885]: I0308 19:34:08.368121 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:08 crc kubenswrapper[4885]: I0308 19:34:08.368121 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:08 crc kubenswrapper[4885]: E0308 19:34:08.369070 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:08 crc kubenswrapper[4885]: E0308 19:34:08.369228 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:08 crc kubenswrapper[4885]: E0308 19:34:08.369380 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:08 crc kubenswrapper[4885]: I0308 19:34:08.369607 4885 scope.go:117] "RemoveContainer" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.220615 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/2.log" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.225088 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000"} Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.225777 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.245593 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.268375 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.289646 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.308069 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.323668 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc 
kubenswrapper[4885]: I0308 19:34:09.361600 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.384223 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.398883 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.422208 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.437396 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.451307 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.468459 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.489995 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: E0308 19:34:09.511161 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.518819 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.539587 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.559637 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.575775 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.596758 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.614187 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02a
da03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.639947 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.662958 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.679900 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.697595 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.712188 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc 
kubenswrapper[4885]: I0308 19:34:09.732472 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.754615 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.771904 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.791453 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.805850 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.821594 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.837564 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02a
da03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.857355 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.875881 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.906508 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.926034 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.944005 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.965418 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.984102 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.232058 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/3.log" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.233475 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/2.log" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.237553 4885 generic.go:334] 
"Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" exitCode=1 Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.237638 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000"} Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.237703 4885 scope.go:117] "RemoveContainer" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.239042 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:34:10 crc kubenswrapper[4885]: E0308 19:34:10.239569 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.261996 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.283613 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.301612 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.323267 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.356994 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 
19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:09Z\\\",\\\"message\\\":\\\"p:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:34:09.360032 7381 services_controller.go:452] Built service openshift-kube-apiserver/apiserver per-node LB for network=default: []services.LB{}\\\\nF0308 19:34:09.360037 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for node Informer during admin network policy controller initialization, handler {0x1fcb760 0x1fcb440 0x1fcb3e0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\
\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.367963 4885 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.368036 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.368048 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:10 crc kubenswrapper[4885]: E0308 19:34:10.368124 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.368161 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:10 crc kubenswrapper[4885]: E0308 19:34:10.368369 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:10 crc kubenswrapper[4885]: E0308 19:34:10.368469 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:10 crc kubenswrapper[4885]: E0308 19:34:10.368604 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.377594 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.403770 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.421718 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.440675 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.457524 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02a
da03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.478185 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.498041 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.520177 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.540529 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.556663 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc 
kubenswrapper[4885]: I0308 19:34:10.588707 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.610505 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.628460 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.646059 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.244425 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/3.log" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.249870 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:34:11 crc kubenswrapper[4885]: E0308 19:34:11.250191 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.270302 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.291603 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.314507 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.334054 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.354428 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.370524 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc 
kubenswrapper[4885]: I0308 19:34:11.402080 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.425484 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.441297 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.459880 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.480511 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.495723 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.515720 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.535767 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.566900 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:09Z\\\",\\\"message\\\":\\\"p:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:34:09.360032 7381 services_controller.go:452] Built service openshift-kube-apiserver/apiserver per-node LB for network=default: []services.LB{}\\\\nF0308 19:34:09.360037 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for node Informer during admin network policy controller initialization, handler {0x1fcb760 0x1fcb440 0x1fcb3e0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.584408 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.607285 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.623917 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.641472 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.367528 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.367704 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.368229 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.368434 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.368529 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.368913 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.369014 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.369129 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.809380 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.810549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.810598 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.810638 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.810664 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:12Z","lastTransitionTime":"2026-03-08T19:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.834379 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:12Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.840445 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.840701 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.840858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.841053 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.841183 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:12Z","lastTransitionTime":"2026-03-08T19:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.861257 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:12Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.866497 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.866536 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.866558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.866584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.866602 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:12Z","lastTransitionTime":"2026-03-08T19:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.886106 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:12Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.891167 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.891244 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.891268 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.891299 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.891321 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:12Z","lastTransitionTime":"2026-03-08T19:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.912295 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:12Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.916813 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.916878 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.916896 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.916957 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.916977 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:12Z","lastTransitionTime":"2026-03-08T19:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.936865 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:12Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.937135 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:34:14 crc kubenswrapper[4885]: I0308 19:34:14.367953 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:14 crc kubenswrapper[4885]: I0308 19:34:14.368008 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:14 crc kubenswrapper[4885]: I0308 19:34:14.367974 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:14 crc kubenswrapper[4885]: I0308 19:34:14.368076 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:14 crc kubenswrapper[4885]: E0308 19:34:14.368214 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:14 crc kubenswrapper[4885]: E0308 19:34:14.368386 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:14 crc kubenswrapper[4885]: E0308 19:34:14.368586 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:14 crc kubenswrapper[4885]: E0308 19:34:14.368760 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:14 crc kubenswrapper[4885]: E0308 19:34:14.512976 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:34:16 crc kubenswrapper[4885]: I0308 19:34:16.367589 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:16 crc kubenswrapper[4885]: I0308 19:34:16.367622 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:16 crc kubenswrapper[4885]: I0308 19:34:16.367695 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:16 crc kubenswrapper[4885]: I0308 19:34:16.367710 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:16 crc kubenswrapper[4885]: E0308 19:34:16.367792 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:16 crc kubenswrapper[4885]: E0308 19:34:16.367948 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:16 crc kubenswrapper[4885]: E0308 19:34:16.368042 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:16 crc kubenswrapper[4885]: E0308 19:34:16.368194 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:18 crc kubenswrapper[4885]: I0308 19:34:18.368030 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:18 crc kubenswrapper[4885]: I0308 19:34:18.368051 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:18 crc kubenswrapper[4885]: I0308 19:34:18.368123 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:18 crc kubenswrapper[4885]: E0308 19:34:18.368472 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:18 crc kubenswrapper[4885]: E0308 19:34:18.368746 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:18 crc kubenswrapper[4885]: I0308 19:34:18.368971 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:18 crc kubenswrapper[4885]: E0308 19:34:18.369046 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:18 crc kubenswrapper[4885]: E0308 19:34:18.369108 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.389693 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.405675 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.424294 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.444684 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.479067 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:09Z\\\",\\\"message\\\":\\\"p:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:34:09.360032 7381 services_controller.go:452] Built service openshift-kube-apiserver/apiserver per-node LB for network=default: []services.LB{}\\\\nF0308 19:34:09.360037 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for node Informer during admin network policy controller initialization, handler {0x1fcb760 0x1fcb440 0x1fcb3e0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.500187 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: E0308 19:34:19.514044 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.526373 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd
46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.542549 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.561555 
4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.579392 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.598840 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.622422 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbf
f21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.643166 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.660721 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.678544 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc 
kubenswrapper[4885]: I0308 19:34:19.712017 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.733972 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.753052 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.775477 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:20 crc kubenswrapper[4885]: I0308 19:34:20.368071 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:20 crc kubenswrapper[4885]: I0308 19:34:20.368192 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:20 crc kubenswrapper[4885]: I0308 19:34:20.368199 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:20 crc kubenswrapper[4885]: I0308 19:34:20.368390 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:20 crc kubenswrapper[4885]: E0308 19:34:20.368559 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:20 crc kubenswrapper[4885]: E0308 19:34:20.368682 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:20 crc kubenswrapper[4885]: E0308 19:34:20.368809 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:20 crc kubenswrapper[4885]: E0308 19:34:20.369083 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:22 crc kubenswrapper[4885]: I0308 19:34:22.367180 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:22 crc kubenswrapper[4885]: I0308 19:34:22.367180 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:22 crc kubenswrapper[4885]: E0308 19:34:22.367765 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:22 crc kubenswrapper[4885]: I0308 19:34:22.367349 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:22 crc kubenswrapper[4885]: I0308 19:34:22.367330 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:22 crc kubenswrapper[4885]: E0308 19:34:22.368039 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:22 crc kubenswrapper[4885]: E0308 19:34:22.368224 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:22 crc kubenswrapper[4885]: E0308 19:34:22.368392 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.070033 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.070102 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.070126 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.070158 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.070182 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:23Z","lastTransitionTime":"2026-03-08T19:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.091392 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.096616 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.096672 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.096692 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.096718 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.096737 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:23Z","lastTransitionTime":"2026-03-08T19:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.117793 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.123419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.123484 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.123510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.123545 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.123569 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:23Z","lastTransitionTime":"2026-03-08T19:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.145283 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.151605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.151666 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.151682 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.151711 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.151730 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:23Z","lastTransitionTime":"2026-03-08T19:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.173745 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.179160 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.179235 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.179252 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.179277 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.179293 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:23Z","lastTransitionTime":"2026-03-08T19:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.211806 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.212059 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:34:24 crc kubenswrapper[4885]: I0308 19:34:24.367316 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:24 crc kubenswrapper[4885]: I0308 19:34:24.367371 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.367500 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:24 crc kubenswrapper[4885]: I0308 19:34:24.367565 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.367644 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:24 crc kubenswrapper[4885]: I0308 19:34:24.367686 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.368123 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.368267 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:24 crc kubenswrapper[4885]: I0308 19:34:24.368778 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.369110 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.516123 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:34:26 crc kubenswrapper[4885]: I0308 19:34:26.367160 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:26 crc kubenswrapper[4885]: I0308 19:34:26.367222 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:26 crc kubenswrapper[4885]: I0308 19:34:26.367225 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:26 crc kubenswrapper[4885]: I0308 19:34:26.367160 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:26 crc kubenswrapper[4885]: E0308 19:34:26.367418 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:26 crc kubenswrapper[4885]: E0308 19:34:26.367548 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:26 crc kubenswrapper[4885]: E0308 19:34:26.367623 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:26 crc kubenswrapper[4885]: E0308 19:34:26.367701 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:28 crc kubenswrapper[4885]: I0308 19:34:28.367141 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:28 crc kubenswrapper[4885]: I0308 19:34:28.367205 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:28 crc kubenswrapper[4885]: I0308 19:34:28.367275 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:28 crc kubenswrapper[4885]: I0308 19:34:28.367156 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:28 crc kubenswrapper[4885]: E0308 19:34:28.367399 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:28 crc kubenswrapper[4885]: E0308 19:34:28.367505 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:28 crc kubenswrapper[4885]: E0308 19:34:28.367607 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:28 crc kubenswrapper[4885]: E0308 19:34:28.367693 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.402307 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.426546 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.448163 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.468899 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.488913 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc 
kubenswrapper[4885]: I0308 19:34:29.510468 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: E0308 19:34:29.517352 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.545679 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.561624 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.578224 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.597759 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.614462 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02a
da03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.632675 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.649666 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.675343 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:09Z\\\",\\\"message\\\":\\\"p:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:34:09.360032 7381 services_controller.go:452] Built service openshift-kube-apiserver/apiserver per-node LB for network=default: []services.LB{}\\\\nF0308 19:34:09.360037 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for node Informer during admin network policy controller initialization, handler {0x1fcb760 0x1fcb440 0x1fcb3e0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.690340 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.713257 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.735486 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.757212 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.773502 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:30 crc kubenswrapper[4885]: I0308 19:34:30.367871 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:30 crc kubenswrapper[4885]: I0308 19:34:30.368019 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:30 crc kubenswrapper[4885]: I0308 19:34:30.368109 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:30 crc kubenswrapper[4885]: I0308 19:34:30.368312 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:30 crc kubenswrapper[4885]: E0308 19:34:30.368288 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:30 crc kubenswrapper[4885]: E0308 19:34:30.368441 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:30 crc kubenswrapper[4885]: E0308 19:34:30.368594 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:30 crc kubenswrapper[4885]: E0308 19:34:30.368822 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:32 crc kubenswrapper[4885]: I0308 19:34:32.367487 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:32 crc kubenswrapper[4885]: I0308 19:34:32.367546 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:32 crc kubenswrapper[4885]: I0308 19:34:32.367628 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:32 crc kubenswrapper[4885]: I0308 19:34:32.367735 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:32 crc kubenswrapper[4885]: E0308 19:34:32.367732 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:32 crc kubenswrapper[4885]: E0308 19:34:32.367902 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:32 crc kubenswrapper[4885]: E0308 19:34:32.368062 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:32 crc kubenswrapper[4885]: E0308 19:34:32.368192 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.589883 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.589986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.590005 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.590032 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.590049 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:33Z","lastTransitionTime":"2026-03-08T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.613203 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.619120 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.619174 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.619192 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.619215 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.619233 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:33Z","lastTransitionTime":"2026-03-08T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.641334 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.655056 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.655134 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.655155 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.655189 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.655213 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:33Z","lastTransitionTime":"2026-03-08T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.678838 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:33Z is after 2025-08-24T17:21:41Z"
Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.684841 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.684903 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.684969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.684997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.685024 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:33Z","lastTransitionTime":"2026-03-08T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.708499 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:33Z is after 2025-08-24T17:21:41Z"
Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.714365 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.714439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.714460 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.714488 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.714507 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:33Z","lastTransitionTime":"2026-03-08T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.737203 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:33Z is after 2025-08-24T17:21:41Z"
Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.737432 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 08 19:34:34 crc kubenswrapper[4885]: I0308 19:34:34.367567 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:34:34 crc kubenswrapper[4885]: I0308 19:34:34.367644 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:34:34 crc kubenswrapper[4885]: I0308 19:34:34.368153 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:34:34 crc kubenswrapper[4885]: E0308 19:34:34.368334 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:34:34 crc kubenswrapper[4885]: I0308 19:34:34.368396 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:34:34 crc kubenswrapper[4885]: E0308 19:34:34.368541 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:34:34 crc kubenswrapper[4885]: E0308 19:34:34.368601 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:34:34 crc kubenswrapper[4885]: E0308 19:34:34.368686 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:34:34 crc kubenswrapper[4885]: E0308 19:34:34.519128 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:34:35 crc kubenswrapper[4885]: I0308 19:34:35.368914 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:34:35 crc kubenswrapper[4885]: E0308 19:34:35.369281 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:34:36 crc kubenswrapper[4885]: I0308 19:34:36.367505 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:36 crc kubenswrapper[4885]: I0308 19:34:36.367565 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:36 crc kubenswrapper[4885]: I0308 19:34:36.367630 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:36 crc kubenswrapper[4885]: E0308 19:34:36.367707 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:36 crc kubenswrapper[4885]: I0308 19:34:36.367604 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:36 crc kubenswrapper[4885]: E0308 19:34:36.367849 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:36 crc kubenswrapper[4885]: E0308 19:34:36.368096 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:36 crc kubenswrapper[4885]: E0308 19:34:36.368255 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:38 crc kubenswrapper[4885]: I0308 19:34:38.367122 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:38 crc kubenswrapper[4885]: I0308 19:34:38.367213 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:38 crc kubenswrapper[4885]: E0308 19:34:38.367324 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:38 crc kubenswrapper[4885]: I0308 19:34:38.367380 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:38 crc kubenswrapper[4885]: I0308 19:34:38.367413 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:38 crc kubenswrapper[4885]: E0308 19:34:38.367552 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:38 crc kubenswrapper[4885]: E0308 19:34:38.367956 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:38 crc kubenswrapper[4885]: E0308 19:34:38.368154 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.393215 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.413016 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.436582 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:09Z\\\",\\\"message\\\":\\\"p:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:34:09.360032 7381 services_controller.go:452] Built service openshift-kube-apiserver/apiserver per-node LB for network=default: []services.LB{}\\\\nF0308 19:34:09.360037 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for node Informer during admin network policy controller initialization, handler {0x1fcb760 0x1fcb440 0x1fcb3e0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd688425
5b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.449482 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c4502
2d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.470603 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421
efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.487275 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.503865 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.517294 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02a
da03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: E0308 19:34:39.520092 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.536897 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.557125 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] 
Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.575157 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.590332 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d85
3406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.605893 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc 
kubenswrapper[4885]: I0308 19:34:39.636276 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.655677 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.669099 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.688394 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.709325 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z"
Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.723209 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z"
Mar 08 19:34:40 crc kubenswrapper[4885]: I0308 19:34:40.367403 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:34:40 crc kubenswrapper[4885]: I0308 19:34:40.367689 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:34:40 crc kubenswrapper[4885]: I0308 19:34:40.367748 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:34:40 crc kubenswrapper[4885]: I0308 19:34:40.367689 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:34:40 crc kubenswrapper[4885]: E0308 19:34:40.367894 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:34:40 crc kubenswrapper[4885]: E0308 19:34:40.368085 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:34:40 crc kubenswrapper[4885]: E0308 19:34:40.368689 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:34:40 crc kubenswrapper[4885]: E0308 19:34:40.368852 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:34:42 crc kubenswrapper[4885]: I0308 19:34:42.367216 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:34:42 crc kubenswrapper[4885]: I0308 19:34:42.367444 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:34:42 crc kubenswrapper[4885]: E0308 19:34:42.367739 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:34:42 crc kubenswrapper[4885]: I0308 19:34:42.368185 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:34:42 crc kubenswrapper[4885]: I0308 19:34:42.368208 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:34:42 crc kubenswrapper[4885]: E0308 19:34:42.368358 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:34:42 crc kubenswrapper[4885]: E0308 19:34:42.368618 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:34:42 crc kubenswrapper[4885]: E0308 19:34:42.368705 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.034560 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.034617 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.034637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.034662 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.034680 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:44Z","lastTransitionTime":"2026-03-08T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.111313 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"]
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.111979 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.115645 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.115952 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.116221 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.119127 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.186292 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.186382 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ff7b4" podStartSLOduration=149.186344079 podStartE2EDuration="2m29.186344079s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.165147566 +0000 UTC m=+185.561201629" watchObservedRunningTime="2026-03-08 19:34:44.186344079 +0000 UTC m=+185.582398132"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.186673 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d4b007-1aec-465c-a5db-92efaa4defbe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.186868 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07d4b007-1aec-465c-a5db-92efaa4defbe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.186968 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.187030 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d4b007-1aec-465c-a5db-92efaa4defbe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.213186 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podStartSLOduration=149.213156729 podStartE2EDuration="2m29.213156729s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.212776949 +0000 UTC m=+185.608830982" watchObservedRunningTime="2026-03-08 19:34:44.213156729 +0000 UTC m=+185.609210782"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288112 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d4b007-1aec-465c-a5db-92efaa4defbe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288235 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07d4b007-1aec-465c-a5db-92efaa4defbe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288283 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288338 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288382 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288394 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d4b007-1aec-465c-a5db-92efaa4defbe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.289560 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d4b007-1aec-465c-a5db-92efaa4defbe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.296104 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d4b007-1aec-465c-a5db-92efaa4defbe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.327731 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=100.32770924 podStartE2EDuration="1m40.32770924s" podCreationTimestamp="2026-03-08 19:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.296560426 +0000 UTC m=+185.692614509" watchObservedRunningTime="2026-03-08 19:34:44.32770924 +0000 UTC m=+185.723763273"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.327998 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=97.327992738 podStartE2EDuration="1m37.327992738s" podCreationTimestamp="2026-03-08 19:33:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.32735488 +0000 UTC m=+185.723408903" watchObservedRunningTime="2026-03-08 19:34:44.327992738 +0000 UTC m=+185.724046771"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.334733 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07d4b007-1aec-465c-a5db-92efaa4defbe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.361592 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=70.361573955 podStartE2EDuration="1m10.361573955s" podCreationTimestamp="2026-03-08 19:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.361101953 +0000 UTC m=+185.757155976" watchObservedRunningTime="2026-03-08 19:34:44.361573955 +0000 UTC m=+185.757627988"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.367856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:34:44 crc kubenswrapper[4885]: E0308 19:34:44.367999 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.368198 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.368281 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.368345 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:34:44 crc kubenswrapper[4885]: E0308 19:34:44.368387 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:34:44 crc kubenswrapper[4885]: E0308 19:34:44.368654 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:34:44 crc kubenswrapper[4885]: E0308 19:34:44.368832 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.396974 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.402497 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-57qch" podStartSLOduration=149.402470452 podStartE2EDuration="2m29.402470452s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.401114947 +0000 UTC m=+185.797168970" watchObservedRunningTime="2026-03-08 19:34:44.402470452 +0000 UTC m=+185.798524515"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.409223 4885 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.433257 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"
Mar 08 19:34:44 crc kubenswrapper[4885]: W0308 19:34:44.451812 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07d4b007_1aec_465c_a5db_92efaa4defbe.slice/crio-819395f9bd1c9edfe8385fdc42ceda224dc8de4c2cbdc2b48a57031866536c6e WatchSource:0}: Error finding container 819395f9bd1c9edfe8385fdc42ceda224dc8de4c2cbdc2b48a57031866536c6e: Status 404 returned error can't find the container with id 819395f9bd1c9edfe8385fdc42ceda224dc8de4c2cbdc2b48a57031866536c6e
Mar 08 19:34:44 crc kubenswrapper[4885]: E0308 19:34:44.521345 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.533823 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" podStartSLOduration=148.533789861 podStartE2EDuration="2m28.533789861s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.504156418 +0000 UTC m=+185.900210511" watchObservedRunningTime="2026-03-08 19:34:44.533789861 +0000 UTC m=+185.929843924"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.534264 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-25vxd" podStartSLOduration=149.534254823 podStartE2EDuration="2m29.534254823s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.534004457 +0000 UTC m=+185.930058520" watchObservedRunningTime="2026-03-08 19:34:44.534254823 +0000 UTC m=+185.930308886"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.552223 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=52.552183961 podStartE2EDuration="52.552183961s" podCreationTimestamp="2026-03-08 19:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.552020347 +0000 UTC m=+185.948074370" watchObservedRunningTime="2026-03-08 19:34:44.552183961 +0000 UTC m=+185.948238014"
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.589901 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=53.589863025 podStartE2EDuration="53.589863025s" podCreationTimestamp="2026-03-08 19:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.572168583 +0000 UTC m=+185.968222646" watchObservedRunningTime="2026-03-08 19:34:44.589863025 +0000 UTC m=+185.985917078"
Mar 08 19:34:45 crc kubenswrapper[4885]: I0308 19:34:45.398204 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" event={"ID":"07d4b007-1aec-465c-a5db-92efaa4defbe","Type":"ContainerStarted","Data":"6913dce9a030a733bc10cac09bffc5f9aa0b5fbf88ad608321a140780b687ab6"}
Mar 08 19:34:45 crc kubenswrapper[4885]: I0308 19:34:45.398325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" event={"ID":"07d4b007-1aec-465c-a5db-92efaa4defbe","Type":"ContainerStarted","Data":"819395f9bd1c9edfe8385fdc42ceda224dc8de4c2cbdc2b48a57031866536c6e"}
Mar 08 19:34:45 crc kubenswrapper[4885]: I0308 19:34:45.418612 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w5lms" podStartSLOduration=150.418583285 podStartE2EDuration="2m30.418583285s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.587821072 +0000 UTC m=+185.983875125" watchObservedRunningTime="2026-03-08 19:34:45.418583285 +0000 UTC m=+186.814637308"
Mar 08 19:34:45 crc kubenswrapper[4885]: I0308 19:34:45.420329 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" podStartSLOduration=150.4203144 podStartE2EDuration="2m30.4203144s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:45.418860142 +0000 UTC m=+186.814914215" watchObservedRunningTime="2026-03-08 19:34:45.4203144 +0000 UTC m=+186.816368453"
Mar 08 19:34:46 crc kubenswrapper[4885]: I0308 19:34:46.367383 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:34:46 crc kubenswrapper[4885]: I0308 19:34:46.367423 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:34:46 crc kubenswrapper[4885]: E0308 19:34:46.367597 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:34:46 crc kubenswrapper[4885]: I0308 19:34:46.367671 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:34:46 crc kubenswrapper[4885]: E0308 19:34:46.367764 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:34:46 crc kubenswrapper[4885]: I0308 19:34:46.367816 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:34:46 crc kubenswrapper[4885]: E0308 19:34:46.367902 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:34:46 crc kubenswrapper[4885]: E0308 19:34:46.368476 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:34:47 crc kubenswrapper[4885]: I0308 19:34:47.369233 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000"
Mar 08 19:34:47 crc kubenswrapper[4885]: E0308 19:34:47.369692 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34"
Mar 08 19:34:48 crc kubenswrapper[4885]: I0308 19:34:48.367763 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:34:48 crc kubenswrapper[4885]: I0308 19:34:48.367855 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:34:48 crc kubenswrapper[4885]: I0308 19:34:48.367856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:34:48 crc kubenswrapper[4885]: I0308 19:34:48.367856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:34:48 crc kubenswrapper[4885]: E0308 19:34:48.367990 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:34:48 crc kubenswrapper[4885]: E0308 19:34:48.368091 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:34:48 crc kubenswrapper[4885]: E0308 19:34:48.368201 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:34:48 crc kubenswrapper[4885]: E0308 19:34:48.368329 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:34:49 crc kubenswrapper[4885]: E0308 19:34:49.522375 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 08 19:34:50 crc kubenswrapper[4885]: I0308 19:34:50.367576 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:34:50 crc kubenswrapper[4885]: I0308 19:34:50.367602 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:34:50 crc kubenswrapper[4885]: E0308 19:34:50.368108 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:34:50 crc kubenswrapper[4885]: I0308 19:34:50.367651 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:34:50 crc kubenswrapper[4885]: I0308 19:34:50.367639 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:34:50 crc kubenswrapper[4885]: E0308 19:34:50.368403 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:34:50 crc kubenswrapper[4885]: E0308 19:34:50.368750 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:34:50 crc kubenswrapper[4885]: E0308 19:34:50.368630 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.367367 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.367447 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.367493 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:52 crc kubenswrapper[4885]: E0308 19:34:52.367572 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.367735 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:52 crc kubenswrapper[4885]: E0308 19:34:52.367711 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:52 crc kubenswrapper[4885]: E0308 19:34:52.367856 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:52 crc kubenswrapper[4885]: E0308 19:34:52.368062 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.427745 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/1.log" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.428791 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/0.log" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.428893 4885 generic.go:334] "Generic (PLEG): container finished" podID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" containerID="f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d" exitCode=1 Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.428996 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerDied","Data":"f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d"} Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.429079 4885 scope.go:117] "RemoveContainer" containerID="578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.429667 4885 scope.go:117] "RemoveContainer" containerID="f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d" Mar 08 19:34:52 crc kubenswrapper[4885]: E0308 
19:34:52.429974 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-ff7b4_openshift-multus(9ac72c25-d3e6-4dda-8444-6cd4442af7e4)\"" pod="openshift-multus/multus-ff7b4" podUID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" Mar 08 19:34:53 crc kubenswrapper[4885]: I0308 19:34:53.433521 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/1.log" Mar 08 19:34:54 crc kubenswrapper[4885]: I0308 19:34:54.367555 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:54 crc kubenswrapper[4885]: I0308 19:34:54.367601 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:54 crc kubenswrapper[4885]: I0308 19:34:54.367570 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:54 crc kubenswrapper[4885]: I0308 19:34:54.367570 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:54 crc kubenswrapper[4885]: E0308 19:34:54.367724 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:54 crc kubenswrapper[4885]: E0308 19:34:54.367817 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:54 crc kubenswrapper[4885]: E0308 19:34:54.368013 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:54 crc kubenswrapper[4885]: E0308 19:34:54.368153 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:54 crc kubenswrapper[4885]: E0308 19:34:54.523775 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:34:56 crc kubenswrapper[4885]: I0308 19:34:56.367056 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:56 crc kubenswrapper[4885]: I0308 19:34:56.367128 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:56 crc kubenswrapper[4885]: I0308 19:34:56.367153 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:56 crc kubenswrapper[4885]: E0308 19:34:56.367262 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:56 crc kubenswrapper[4885]: E0308 19:34:56.367385 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:56 crc kubenswrapper[4885]: E0308 19:34:56.367466 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:56 crc kubenswrapper[4885]: I0308 19:34:56.367091 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:56 crc kubenswrapper[4885]: E0308 19:34:56.368821 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:58 crc kubenswrapper[4885]: I0308 19:34:58.367649 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:58 crc kubenswrapper[4885]: I0308 19:34:58.367700 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:58 crc kubenswrapper[4885]: I0308 19:34:58.367682 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:58 crc kubenswrapper[4885]: I0308 19:34:58.367653 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:58 crc kubenswrapper[4885]: E0308 19:34:58.367877 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:58 crc kubenswrapper[4885]: E0308 19:34:58.368072 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:58 crc kubenswrapper[4885]: E0308 19:34:58.368308 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:58 crc kubenswrapper[4885]: E0308 19:34:58.368405 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:59 crc kubenswrapper[4885]: E0308 19:34:59.524694 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:35:00 crc kubenswrapper[4885]: I0308 19:35:00.367053 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:35:00 crc kubenswrapper[4885]: I0308 19:35:00.367100 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:00 crc kubenswrapper[4885]: I0308 19:35:00.367095 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:00 crc kubenswrapper[4885]: E0308 19:35:00.367275 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:35:00 crc kubenswrapper[4885]: I0308 19:35:00.367305 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:35:00 crc kubenswrapper[4885]: E0308 19:35:00.367465 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:35:00 crc kubenswrapper[4885]: E0308 19:35:00.367627 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:35:00 crc kubenswrapper[4885]: E0308 19:35:00.367798 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:35:02 crc kubenswrapper[4885]: I0308 19:35:02.367505 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:02 crc kubenswrapper[4885]: I0308 19:35:02.367580 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:02 crc kubenswrapper[4885]: I0308 19:35:02.367541 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:35:02 crc kubenswrapper[4885]: I0308 19:35:02.367693 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:35:02 crc kubenswrapper[4885]: E0308 19:35:02.368037 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:35:02 crc kubenswrapper[4885]: E0308 19:35:02.368646 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:35:02 crc kubenswrapper[4885]: E0308 19:35:02.368770 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:35:02 crc kubenswrapper[4885]: E0308 19:35:02.368906 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:35:02 crc kubenswrapper[4885]: I0308 19:35:02.369473 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.478364 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/3.log" Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.483741 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"9417467a958808ec76e7baaf3e912528258fa08f33b991a9a656c8f2699dfe08"} Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.484799 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.570434 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podStartSLOduration=168.570395124 podStartE2EDuration="2m48.570395124s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:03.522614802 +0000 UTC m=+204.918668855" watchObservedRunningTime="2026-03-08 19:35:03.570395124 +0000 UTC m=+204.966449177" Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.571765 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jps4r"] Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.571962 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:03 crc kubenswrapper[4885]: E0308 19:35:03.572161 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:35:04 crc kubenswrapper[4885]: I0308 19:35:04.367712 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:35:04 crc kubenswrapper[4885]: I0308 19:35:04.367873 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:35:04 crc kubenswrapper[4885]: E0308 19:35:04.367907 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:35:04 crc kubenswrapper[4885]: I0308 19:35:04.368014 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:04 crc kubenswrapper[4885]: E0308 19:35:04.368183 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:35:04 crc kubenswrapper[4885]: E0308 19:35:04.368283 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:35:04 crc kubenswrapper[4885]: E0308 19:35:04.525890 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:35:05 crc kubenswrapper[4885]: I0308 19:35:05.367589 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:05 crc kubenswrapper[4885]: E0308 19:35:05.367816 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:35:06 crc kubenswrapper[4885]: I0308 19:35:06.367996 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:35:06 crc kubenswrapper[4885]: I0308 19:35:06.368034 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:35:06 crc kubenswrapper[4885]: I0308 19:35:06.368006 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:06 crc kubenswrapper[4885]: E0308 19:35:06.368170 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:35:06 crc kubenswrapper[4885]: E0308 19:35:06.368277 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:35:06 crc kubenswrapper[4885]: E0308 19:35:06.368765 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:35:07 crc kubenswrapper[4885]: I0308 19:35:07.367296 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:35:07 crc kubenswrapper[4885]: I0308 19:35:07.367859 4885 scope.go:117] "RemoveContainer" containerID="f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d"
Mar 08 19:35:07 crc kubenswrapper[4885]: E0308 19:35:07.367884 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:35:08 crc kubenswrapper[4885]: I0308 19:35:08.367803 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:35:08 crc kubenswrapper[4885]: I0308 19:35:08.367868 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:35:08 crc kubenswrapper[4885]: E0308 19:35:08.368518 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:35:08 crc kubenswrapper[4885]: I0308 19:35:08.368037 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:35:08 crc kubenswrapper[4885]: E0308 19:35:08.368739 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:35:08 crc kubenswrapper[4885]: E0308 19:35:08.369012 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:35:08 crc kubenswrapper[4885]: I0308 19:35:08.508544 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/1.log"
Mar 08 19:35:08 crc kubenswrapper[4885]: I0308 19:35:08.508661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerStarted","Data":"47b9aa6e943174d2f8819d017007c51f3809d8a8e2d7a64900f1aa71bf065584"}
Mar 08 19:35:09 crc kubenswrapper[4885]: I0308 19:35:09.367538 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:35:09 crc kubenswrapper[4885]: E0308 19:35:09.370423 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438"
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.297814 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.297888 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.298002 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.298094 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298121 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298185 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298205 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298211 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298273 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298288 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.298262533 +0000 UTC m=+333.694316596 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298397 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.298366025 +0000 UTC m=+333.694420088 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298430 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.298413187 +0000 UTC m=+333.694467400 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298454 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298473 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298489 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298558 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.29853396 +0000 UTC m=+333.694587983 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.368098 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.368146 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.368684 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.372776 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.373379 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.373979 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.374323 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.399110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.399345 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.399305181 +0000 UTC m=+333.795359234 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.399854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.400116 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.400311 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.400294068 +0000 UTC m=+333.796348121 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 08 19:35:11 crc kubenswrapper[4885]: I0308 19:35:11.367837 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:35:11 crc kubenswrapper[4885]: I0308 19:35:11.370536 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 08 19:35:11 crc kubenswrapper[4885]: I0308 19:35:11.371132 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.798523 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.849148 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.849815 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.850408 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cpx85"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.851147 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.854476 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.855702 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.857996 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.858783 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.861592 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.863370 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.866421 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.867517 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.869553 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.867527 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.873605 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.874440 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.883144 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884179 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884246 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884387 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884461 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884701 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884875 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.885078 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.885259 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.885627 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.887895 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.889677 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.890182 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.890424 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.890644 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.890682 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.890708 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.891091 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.892349 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.893282 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.893404 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.894238 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q8q8m"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.894482 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.894784 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hsdmw"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.895307 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hsdmw"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.895400 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.895829 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.896451 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.907591 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.907841 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908071 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908432 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2tz9t"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908531 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908631 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908727 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908817 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908908 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.909097 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2tz9t"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.910407 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.910690 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.911253 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.911496 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.911905 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.912367 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.912470 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.912682 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.912904 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.913052 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.912995 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.913637 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.913975 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.914245 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.917792 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.919130 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.924412 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.924529 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.925064 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.925344 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-t2b7w"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.925701 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-t2b7w"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.926155 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.926816 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.934622 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.935382 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fq2fp"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.935749 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.936130 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.936487 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.936861 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.937177 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.937536 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.937789 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.937958 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.937994 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944079 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944229 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944259 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944388 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944675 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944799 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944913 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945044 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945145 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945245 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945563 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945787 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945998 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.946114 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.946497 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.947503 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.947667 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.947818 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.947824 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.948261 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.948946 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949399 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949440 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949528 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949563 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949664 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949759 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.950572 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.950588 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.950736 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.953120 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt"]
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.953847 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.953906 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwcst\" (UniqueName: \"kubernetes.io/projected/175c50f5-857d-4697-bcde-2ce47f2edfc5-kube-api-access-gwcst\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.953947 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.953974 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/175c50f5-857d-4697-bcde-2ce47f2edfc5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.954005 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.954046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d98wm\" (UniqueName: \"kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf"
Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.954068 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-config\") pod
\"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.954104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.954137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-images\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.961580 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-lvfcn"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.969056 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.970157 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.975830 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.976354 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q9q8c"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.980227 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.982530 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.983184 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.983732 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.984251 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.985100 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.985685 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.986459 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.986689 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.990293 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.991575 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.991767 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.991797 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.991877 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.992051 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.992388 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.992591 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.993165 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.000049 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.002984 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.004730 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.010375 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.010584 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 
19:35:15.010942 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.011266 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.011466 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzvjp"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.012020 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.012241 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.012448 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.016114 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.017178 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.019385 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.019460 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hqtdl"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.020424 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.034248 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.035577 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.035633 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.036208 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.036579 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.036712 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.036847 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037118 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037156 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037337 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037603 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037619 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037662 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.038102 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.038179 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.041853 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5ghwr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.044788 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.045229 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.045689 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.051996 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072836 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbpxs\" (UniqueName: \"kubernetes.io/projected/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-kube-api-access-gbpxs\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:15 crc kubenswrapper[4885]: 
I0308 19:35:15.072864 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072886 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-service-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072907 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072943 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-447lv\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-kube-api-access-447lv\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072968 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9878f6fd-fc1f-4980-a687-84478d0b92c1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072987 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkm6d\" (UniqueName: \"kubernetes.io/projected/9878f6fd-fc1f-4980-a687-84478d0b92c1-kube-api-access-pkm6d\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073004 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b4zh\" (UniqueName: \"kubernetes.io/projected/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-kube-api-access-2b4zh\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073038 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073061 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-client\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" 
Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073126 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d98wm\" (UniqueName: \"kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073148 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-node-pullsecrets\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-config\") pod 
\"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5jp7\" (UniqueName: \"kubernetes.io/projected/75c67b6f-14bc-4d96-a6b6-ae020ace5353-kube-api-access-l5jp7\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073235 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrkrt\" (UniqueName: \"kubernetes.io/projected/5a244e04-1aec-4355-89c5-794667b5969f-kube-api-access-jrkrt\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073255 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073276 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-trusted-ca\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073321 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073342 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a244e04-1aec-4355-89c5-794667b5969f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073372 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-images\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: 
\"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073425 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kc2g\" (UniqueName: \"kubernetes.io/projected/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-kube-api-access-8kc2g\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073446 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwcst\" (UniqueName: \"kubernetes.io/projected/175c50f5-857d-4697-bcde-2ce47f2edfc5-kube-api-access-gwcst\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073468 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073484 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df78e76d-5024-4d31-a0b9-17d0d6c6c258-serving-cert\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073504 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc92s\" (UniqueName: \"kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073523 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073543 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rh9\" (UniqueName: \"kubernetes.io/projected/df78e76d-5024-4d31-a0b9-17d0d6c6c258-kube-api-access-w8rh9\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073574 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-client\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073593 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-serving-cert\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073615 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073631 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvxds\" (UniqueName: \"kubernetes.io/projected/f5b425d2-db8e-45f3-a141-8ac7bd678491-kube-api-access-xvxds\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073651 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-encryption-config\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073671 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5a244e04-1aec-4355-89c5-794667b5969f-serving-cert\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073691 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit-dir\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073708 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073729 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073750 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75c67b6f-14bc-4d96-a6b6-ae020ace5353-machine-approver-tls\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 
19:35:15.073769 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-trusted-ca\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073788 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-serving-cert\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073809 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-policies\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073833 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxk9p\" (UniqueName: \"kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073852 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073873 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9878f6fd-fc1f-4980-a687-84478d0b92c1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073890 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-image-import-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073910 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf8hq\" (UniqueName: \"kubernetes.io/projected/5ac2fbf9-c9bb-4ef8-988f-4407e688ad54-kube-api-access-xf8hq\") pod \"downloads-7954f5f757-t2b7w\" (UID: \"5ac2fbf9-c9bb-4ef8-988f-4407e688ad54\") " pod="openshift-console/downloads-7954f5f757-t2b7w" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073945 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-encryption-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073963 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-config\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073992 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074012 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-serving-cert\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074030 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc 
kubenswrapper[4885]: I0308 19:35:15.074062 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-serving-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074083 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-dir\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074119 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-config\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074135 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074153 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-auth-proxy-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: 
\"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074183 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074202 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/175c50f5-857d-4697-bcde-2ce47f2edfc5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074222 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-metrics-tls\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.075739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.076524 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-config\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.077465 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.078249 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.078638 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.079288 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.080294 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.080414 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.081234 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.082969 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-images\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.096351 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.096973 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/175c50f5-857d-4697-bcde-2ce47f2edfc5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.102283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.104061 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.104132 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549974-jjqkh"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106583 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106611 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q8q8m"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106624 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106635 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hsdmw"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106645 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2tz9t"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106656 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106667 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106761 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.107405 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-s5pkw"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.108237 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.108984 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.110312 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q9q8c"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.111276 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.114268 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzvjp"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.114341 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.117079 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.117108 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fq2fp"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.117329 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 
19:35:15.118349 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.118379 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tw9z2"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.119819 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-bl88k"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.119959 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.120545 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.120906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.122141 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s5pkw"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.123262 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hqtdl"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.124567 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.125554 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.126813 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.127835 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.129086 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.130378 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.131512 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/machine-api-operator-5694c8668f-cpx85"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.132498 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.132673 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t2b7w"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.134035 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.135111 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.136560 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.137692 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.138752 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tw9z2"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.139769 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.140784 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549974-jjqkh"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.141755 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.142866 4885 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.144230 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.145489 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5ghwr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.146576 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.147529 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jbwsr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.148151 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.148529 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jbwsr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.153073 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.173132 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174689 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df78e76d-5024-4d31-a0b9-17d0d6c6c258-serving-cert\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174727 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174776 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc92s\" (UniqueName: \"kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174799 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rh9\" (UniqueName: \"kubernetes.io/projected/df78e76d-5024-4d31-a0b9-17d0d6c6c258-kube-api-access-w8rh9\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174821 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174839 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-client\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174857 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-serving-cert\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174886 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174908 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvxds\" (UniqueName: \"kubernetes.io/projected/f5b425d2-db8e-45f3-a141-8ac7bd678491-kube-api-access-xvxds\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174940 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-encryption-config\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a244e04-1aec-4355-89c5-794667b5969f-serving-cert\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174982 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit-dir\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174998 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175017 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175038 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75c67b6f-14bc-4d96-a6b6-ae020ace5353-machine-approver-tls\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175054 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-trusted-ca\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175074 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-serving-cert\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175094 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-policies\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175112 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxk9p\" (UniqueName: \"kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 
08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175149 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-image-import-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175170 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9878f6fd-fc1f-4980-a687-84478d0b92c1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf8hq\" (UniqueName: \"kubernetes.io/projected/5ac2fbf9-c9bb-4ef8-988f-4407e688ad54-kube-api-access-xf8hq\") pod \"downloads-7954f5f757-t2b7w\" (UID: \"5ac2fbf9-c9bb-4ef8-988f-4407e688ad54\") " pod="openshift-console/downloads-7954f5f757-t2b7w" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175216 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-encryption-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-config\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " 
pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175461 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-serving-cert\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175485 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175508 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-serving-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175525 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-dir\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175542 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-config\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175558 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175574 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-auth-proxy-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-metrics-tls\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175615 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: 
\"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175633 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbpxs\" (UniqueName: \"kubernetes.io/projected/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-kube-api-access-gbpxs\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-service-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175668 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175685 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-447lv\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-kube-api-access-447lv\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175701 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/9878f6fd-fc1f-4980-a687-84478d0b92c1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175718 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkm6d\" (UniqueName: \"kubernetes.io/projected/9878f6fd-fc1f-4980-a687-84478d0b92c1-kube-api-access-pkm6d\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175740 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b4zh\" (UniqueName: \"kubernetes.io/projected/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-kube-api-access-2b4zh\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175765 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175784 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc 
kubenswrapper[4885]: I0308 19:35:15.175802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-client\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175826 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-node-pullsecrets\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175869 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175888 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5jp7\" (UniqueName: \"kubernetes.io/projected/75c67b6f-14bc-4d96-a6b6-ae020ace5353-kube-api-access-l5jp7\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175910 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrkrt\" (UniqueName: \"kubernetes.io/projected/5a244e04-1aec-4355-89c5-794667b5969f-kube-api-access-jrkrt\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175946 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-trusted-ca\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175966 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175991 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a244e04-1aec-4355-89c5-794667b5969f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.176021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.176061 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.176082 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kc2g\" (UniqueName: \"kubernetes.io/projected/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-kube-api-access-8kc2g\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.176617 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.176960 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-auth-proxy-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.177405 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178195 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178226 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-policies\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178293 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-node-pullsecrets\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178549 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-trusted-ca\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178648 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178665 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-client\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.179085 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.179179 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-serving-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.179238 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-dir\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.179670 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.179818 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-config\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.180794 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.180804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a244e04-1aec-4355-89c5-794667b5969f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.180852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-metrics-tls\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.181178 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df78e76d-5024-4d31-a0b9-17d0d6c6c258-serving-cert\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.181207 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit-dir\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.182396 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75c67b6f-14bc-4d96-a6b6-ae020ace5353-machine-approver-tls\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.182465 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-service-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.182544 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-config\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.182686 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.182868 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183075 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183391 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183565 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183758 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183981 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-serving-cert\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-encryption-config\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.184271 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-encryption-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.184682 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-image-import-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.184815 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-serving-cert\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.185269 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.192388 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-serving-cert\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.192823 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-client\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.193734 4885 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.194432 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9878f6fd-fc1f-4980-a687-84478d0b92c1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.196147 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.198016 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.198175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-trusted-ca\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.198378 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a244e04-1aec-4355-89c5-794667b5969f-serving-cert\") pod \"openshift-config-operator-7777fb866f-ftkzn\" 
(UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.202267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9878f6fd-fc1f-4980-a687-84478d0b92c1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.212569 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.232718 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.253381 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.292673 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.313227 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.332698 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.353717 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 19:35:15 crc kubenswrapper[4885]: 
I0308 19:35:15.374138 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.393115 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.416667 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.432610 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.451987 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.474117 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.492481 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.525510 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.533046 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.552905 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.572975 4885 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.602249 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.612830 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.632857 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.653491 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.672997 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.694310 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.712382 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.733812 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.754449 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.773890 4885 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.793682 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.813293 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.833436 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.853346 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.873359 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.913210 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.933127 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.954655 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.972996 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.986991 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7ce8edea-f754-4ee2-a475-2022f99ed7f9-config\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987033 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987092 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987136 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb554\" (UniqueName: \"kubernetes.io/projected/f2a6bad6-cd1e-4e38-88fe-d531ea458683-kube-api-access-jb554\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987158 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a6bad6-cd1e-4e38-88fe-d531ea458683-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: 
\"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987186 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a6bad6-cd1e-4e38-88fe-d531ea458683-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987217 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987236 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfrhm\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: E0308 19:35:15.987489 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.487478732 +0000 UTC m=+217.883532755 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987479 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pld28\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-kube-api-access-pld28\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987555 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987572 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ce8edea-f754-4ee2-a475-2022f99ed7f9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987590 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987610 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987631 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4928f728-c20b-4d8e-83f3-786cf90cf3e6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987660 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987714 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4928f728-c20b-4d8e-83f3-786cf90cf3e6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987735 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ce8edea-f754-4ee2-a475-2022f99ed7f9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.994968 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.013629 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.034316 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.051610 4885 request.go:700] Waited for 1.014463904s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/secrets?fieldSelector=metadata.name%3Dkube-scheduler-operator-serving-cert&limit=500&resourceVersion=0 Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.053972 4885 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.075100 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.088817 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.089063 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.589019804 +0000 UTC m=+217.985073867 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089184 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089272 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfbc2e1-ae98-4c40-a739-877e7296f16a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089305 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089337 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f932056-01e3-43aa-a91a-7f33d20445ba-config-volume\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089415 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089466 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089514 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls\") pod \"image-registry-697d97f7c8-4xs78\" (UID: 
\"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089559 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089598 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-mountpoint-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089664 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjdvb\" (UniqueName: \"kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089832 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-srv-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090016 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090114 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jf4\" (UniqueName: \"kubernetes.io/projected/89561acc-f596-4f61-95b9-0cbc686a0b47-kube-api-access-92jf4\") pod \"migrator-59844c95c7-qnq6k\" (UID: \"89561acc-f596-4f61-95b9-0cbc686a0b47\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090271 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090461 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqgg4\" (UniqueName: \"kubernetes.io/projected/fbfbc2e1-ae98-4c40-a739-877e7296f16a-kube-api-access-lqgg4\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090538 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-key\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc 
kubenswrapper[4885]: I0308 19:35:16.090697 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090785 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-client\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090864 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-profile-collector-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090957 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrsm5\" (UniqueName: \"kubernetes.io/projected/790c2bc5-e8b1-4943-affd-360042eb1a79-kube-api-access-rrsm5\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.091047 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ce8edea-f754-4ee2-a475-2022f99ed7f9-serving-cert\") pod 
\"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.091101 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-socket-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.091160 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp85f\" (UniqueName: \"kubernetes.io/projected/6f932056-01e3-43aa-a91a-7f33d20445ba-kube-api-access-pp85f\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.091212 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/494bb437-45dd-48e3-b932-9c3645e493ef-tmpfs\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.092148 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce8edea-f754-4ee2-a475-2022f99ed7f9-config\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.092426 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a554818d-91a7-48e1-a5a7-5808a5240f3e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.092651 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.092823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtctx\" (UniqueName: \"kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx\") pod \"auto-csr-approver-29549974-jjqkh\" (UID: \"60da1edb-8474-4368-a6ae-0bb2b1b7b845\") " pod="openshift-infra/auto-csr-approver-29549974-jjqkh"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093160 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-webhook-cert\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr"
Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.093192 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.593166465 +0000 UTC m=+217.989220528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093264 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093623 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v24rf\" (UniqueName: \"kubernetes.io/projected/3c8bd61f-4965-4410-9ec7-b858a4529287-kube-api-access-v24rf\") pod \"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093881 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f932056-01e3-43aa-a91a-7f33d20445ba-metrics-tls\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094100 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-config\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093984 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093614 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce8edea-f754-4ee2-a475-2022f99ed7f9-config\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094259 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q64h9\" (UniqueName: \"kubernetes.io/projected/ca80eb80-6964-436a-bf66-0c5fe9b7e641-kube-api-access-q64h9\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094444 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rvsd\" (UniqueName: \"kubernetes.io/projected/0a7420ef-f20d-4d48-a619-627327de2063-kube-api-access-9rvsd\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094494 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094531 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-certs\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094538 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ce8edea-f754-4ee2-a475-2022f99ed7f9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094606 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094644 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ca246d9-b15a-4163-87dc-84b8bc916c4d-proxy-tls\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094679 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a554818d-91a7-48e1-a5a7-5808a5240f3e-config\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094718 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6drnf\" (UniqueName: \"kubernetes.io/projected/753974fb-c7b2-4e2b-a62d-22544f357c9b-kube-api-access-6drnf\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094750 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495803ea-175c-4ad0-ac77-0598ce8213c1-metrics-tls\") pod \"dns-operator-744455d44c-bzvjp\" (UID: \"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094805 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5664b98a-83b1-433d-8449-04a982f77fff-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094838 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6ca246d9-b15a-4163-87dc-84b8bc916c4d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094899 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a6bad6-cd1e-4e38-88fe-d531ea458683-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094983 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.095778 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.097308 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.095899 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.096031 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a6bad6-cd1e-4e38-88fe-d531ea458683-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.097409 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-csi-data-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.097591 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-serving-cert\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.097636 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s7wt\" (UniqueName: \"kubernetes.io/projected/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-kube-api-access-4s7wt\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.097720 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098007 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfrhm\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098230 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqf5f\" (UniqueName: \"kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098429 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-config\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098520 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-node-bootstrap-token\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwqcf\" (UniqueName: \"kubernetes.io/projected/494bb437-45dd-48e3-b932-9c3645e493ef-kube-api-access-fwqcf\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098688 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pld28\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-kube-api-access-pld28\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098795 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-metrics-certs\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098877 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngwcn\" (UniqueName: \"kubernetes.io/projected/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-kube-api-access-ngwcn\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098957 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.099054 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jplnv\" (UniqueName: \"kubernetes.io/projected/a93ee425-a2b2-492c-bafc-2443d2fde2d4-kube-api-access-jplnv\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.099168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.099244 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.099673 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101088 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101439 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ce8edea-f754-4ee2-a475-2022f99ed7f9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101724 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a7420ef-f20d-4d48-a619-627327de2063-cert\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101763 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4928f728-c20b-4d8e-83f3-786cf90cf3e6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101802 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngzsj\" (UniqueName: \"kubernetes.io/projected/e58e5e9a-de88-4209-8100-e9d4e415e68d-kube-api-access-ngzsj\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101838 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101873 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-registration-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101998 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102061 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-default-certificate\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102114 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5664b98a-83b1-433d-8449-04a982f77fff-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102152 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-stats-auth\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102186 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-plugins-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102726 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4928f728-c20b-4d8e-83f3-786cf90cf3e6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102970 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjd6\" (UniqueName: \"kubernetes.io/projected/fc52227b-0572-4fed-a5c1-e86521a20e58-kube-api-access-9jjd6\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.103026 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.103160 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-service-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.103433 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb2k8\" (UniqueName: \"kubernetes.io/projected/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-kube-api-access-hb2k8\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.103623 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np254\" (UniqueName: \"kubernetes.io/projected/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-kube-api-access-np254\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.103879 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104074 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104134 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-images\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104361 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rjfm\" (UniqueName: \"kubernetes.io/projected/495803ea-175c-4ad0-ac77-0598ce8213c1-kube-api-access-7rjfm\") pod \"dns-operator-744455d44c-bzvjp\" (UID: \"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c8bd61f-4965-4410-9ec7-b858a4529287-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-serving-cert\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104808 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-cabundle\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104846 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-apiservice-cert\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104888 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104950 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105006 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g597v\" (UniqueName: \"kubernetes.io/projected/6ca246d9-b15a-4163-87dc-84b8bc916c4d-kube-api-access-g597v\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105063 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb554\" (UniqueName: \"kubernetes.io/projected/f2a6bad6-cd1e-4e38-88fe-d531ea458683-kube-api-access-jb554\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105282 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/790c2bc5-e8b1-4943-affd-360042eb1a79-proxy-tls\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105531 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105590 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a554818d-91a7-48e1-a5a7-5808a5240f3e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105761 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e58e5e9a-de88-4209-8100-e9d4e415e68d-service-ca-bundle\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105845 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msmd\" (UniqueName: \"kubernetes.io/projected/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-kube-api-access-5msmd\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105912 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4fv6\" (UniqueName: \"kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.106038 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a6bad6-cd1e-4e38-88fe-d531ea458683-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.106094 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5664b98a-83b1-433d-8449-04a982f77fff-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.106137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-srv-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.106191 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.106368 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4928f728-c20b-4d8e-83f3-786cf90cf3e6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.107068 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4928f728-c20b-4d8e-83f3-786cf90cf3e6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.110721 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a6bad6-cd1e-4e38-88fe-d531ea458683-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.111698 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.114337 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.134543 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.153006 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.174180 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 08 19:35:16 crc 
kubenswrapper[4885]: I0308 19:35:16.193636 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207065 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207243 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jplnv\" (UniqueName: \"kubernetes.io/projected/a93ee425-a2b2-492c-bafc-2443d2fde2d4-kube-api-access-jplnv\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207281 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.207324 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.707286285 +0000 UTC m=+218.103340348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207425 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a7420ef-f20d-4d48-a619-627327de2063-cert\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207477 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207518 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207556 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngzsj\" (UniqueName: \"kubernetes.io/projected/e58e5e9a-de88-4209-8100-e9d4e415e68d-kube-api-access-ngzsj\") pod 
\"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207593 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-registration-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207630 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-default-certificate\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207751 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5664b98a-83b1-433d-8449-04a982f77fff-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207786 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-stats-auth\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207823 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjd6\" (UniqueName: \"kubernetes.io/projected/fc52227b-0572-4fed-a5c1-e86521a20e58-kube-api-access-9jjd6\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207857 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-plugins-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207955 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 
19:35:16.207996 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np254\" (UniqueName: \"kubernetes.io/projected/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-kube-api-access-np254\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208056 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-service-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208095 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb2k8\" (UniqueName: \"kubernetes.io/projected/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-kube-api-access-hb2k8\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208158 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208204 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir\") pod 
\"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208269 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-images\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208305 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rjfm\" (UniqueName: \"kubernetes.io/projected/495803ea-175c-4ad0-ac77-0598ce8213c1-kube-api-access-7rjfm\") pod \"dns-operator-744455d44c-bzvjp\" (UID: \"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208374 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c8bd61f-4965-4410-9ec7-b858a4529287-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208440 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-cabundle\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208474 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-apiservice-cert\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208510 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-serving-cert\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208548 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g597v\" (UniqueName: \"kubernetes.io/projected/6ca246d9-b15a-4163-87dc-84b8bc916c4d-kube-api-access-g597v\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208628 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208695 
4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208752 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a554818d-91a7-48e1-a5a7-5808a5240f3e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208788 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/790c2bc5-e8b1-4943-affd-360042eb1a79-proxy-tls\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209185 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209569 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-registration-dir\") pod \"csi-hostpathplugin-tw9z2\" 
(UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209690 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209753 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4fv6\" (UniqueName: \"kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209794 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e58e5e9a-de88-4209-8100-e9d4e415e68d-service-ca-bundle\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209834 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msmd\" (UniqueName: \"kubernetes.io/projected/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-kube-api-access-5msmd\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209873 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5664b98a-83b1-433d-8449-04a982f77fff-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209907 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-srv-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209980 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210031 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210066 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: 
\"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210122 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfbc2e1-ae98-4c40-a739-877e7296f16a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210160 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210195 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210231 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f932056-01e3-43aa-a91a-7f33d20445ba-config-volume\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210269 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-mountpoint-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210333 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjdvb\" (UniqueName: \"kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210377 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-srv-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92jf4\" (UniqueName: \"kubernetes.io/projected/89561acc-f596-4f61-95b9-0cbc686a0b47-kube-api-access-92jf4\") pod \"migrator-59844c95c7-qnq6k\" (UID: \"89561acc-f596-4f61-95b9-0cbc686a0b47\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210452 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqgg4\" (UniqueName: \"kubernetes.io/projected/fbfbc2e1-ae98-4c40-a739-877e7296f16a-kube-api-access-lqgg4\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:16 crc 
kubenswrapper[4885]: I0308 19:35:16.210514 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-key\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210552 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-client\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210602 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-profile-collector-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210657 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-socket-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210696 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrsm5\" (UniqueName: \"kubernetes.io/projected/790c2bc5-e8b1-4943-affd-360042eb1a79-kube-api-access-rrsm5\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210735 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp85f\" (UniqueName: \"kubernetes.io/projected/6f932056-01e3-43aa-a91a-7f33d20445ba-kube-api-access-pp85f\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210770 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/494bb437-45dd-48e3-b932-9c3645e493ef-tmpfs\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210806 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a554818d-91a7-48e1-a5a7-5808a5240f3e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210844 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtctx\" (UniqueName: \"kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx\") pod \"auto-csr-approver-29549974-jjqkh\" (UID: \"60da1edb-8474-4368-a6ae-0bb2b1b7b845\") " pod="openshift-infra/auto-csr-approver-29549974-jjqkh" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210898 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v24rf\" (UniqueName: \"kubernetes.io/projected/3c8bd61f-4965-4410-9ec7-b858a4529287-kube-api-access-v24rf\") pod \"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-webhook-cert\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211024 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211056 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f932056-01e3-43aa-a91a-7f33d20445ba-metrics-tls\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211039 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208628 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211135 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e58e5e9a-de88-4209-8100-e9d4e415e68d-service-ca-bundle\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211211 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-plugins-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211106 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-config\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc 
kubenswrapper[4885]: I0308 19:35:16.211289 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q64h9\" (UniqueName: \"kubernetes.io/projected/ca80eb80-6964-436a-bf66-0c5fe9b7e641-kube-api-access-q64h9\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211318 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-certs\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211355 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rvsd\" (UniqueName: \"kubernetes.io/projected/0a7420ef-f20d-4d48-a619-627327de2063-kube-api-access-9rvsd\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211391 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211420 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: 
\"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211445 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ca246d9-b15a-4163-87dc-84b8bc916c4d-proxy-tls\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211467 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a554818d-91a7-48e1-a5a7-5808a5240f3e-config\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5664b98a-83b1-433d-8449-04a982f77fff-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211509 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6drnf\" (UniqueName: \"kubernetes.io/projected/753974fb-c7b2-4e2b-a62d-22544f357c9b-kube-api-access-6drnf\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211530 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495803ea-175c-4ad0-ac77-0598ce8213c1-metrics-tls\") pod \"dns-operator-744455d44c-bzvjp\" (UID: \"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6ca246d9-b15a-4163-87dc-84b8bc916c4d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.212264 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.212300 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.212464 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-config\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.212840 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-service-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.212965 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.213128 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-mountpoint-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.213405 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-socket-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.213879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214089 4885 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214160 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.214096 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.714070726 +0000 UTC m=+218.110124789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214422 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-csi-data-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214494 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-serving-cert\") pod 
\"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214547 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214598 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214675 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqf5f\" (UniqueName: \"kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s7wt\" (UniqueName: \"kubernetes.io/projected/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-kube-api-access-4s7wt\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214750 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-config\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214789 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwqcf\" (UniqueName: \"kubernetes.io/projected/494bb437-45dd-48e3-b932-9c3645e493ef-kube-api-access-fwqcf\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214822 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-node-bootstrap-token\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214865 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-metrics-certs\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214907 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngwcn\" (UniqueName: \"kubernetes.io/projected/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-kube-api-access-ngwcn\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" 
Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214991 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.215202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/494bb437-45dd-48e3-b932-9c3645e493ef-tmpfs\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.215264 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.215300 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-csi-data-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.217096 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5664b98a-83b1-433d-8449-04a982f77fff-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.219393 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.219711 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6ca246d9-b15a-4163-87dc-84b8bc916c4d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.220790 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.220987 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.221240 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.222139 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-default-certificate\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.222324 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a554818d-91a7-48e1-a5a7-5808a5240f3e-config\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.222835 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.223563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 
19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.224083 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-serving-cert\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.224427 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.224576 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495803ea-175c-4ad0-ac77-0598ce8213c1-metrics-tls\") pod \"dns-operator-744455d44c-bzvjp\" (UID: \"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.224676 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.224852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-srv-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.225071 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.226194 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.226216 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-client\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.226399 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-metrics-certs\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.226622 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a554818d-91a7-48e1-a5a7-5808a5240f3e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.227212 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.227653 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-stats-auth\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.228642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ca246d9-b15a-4163-87dc-84b8bc916c4d-proxy-tls\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.229067 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5664b98a-83b1-433d-8449-04a982f77fff-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc 
kubenswrapper[4885]: I0308 19:35:16.229595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-profile-collector-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.231176 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c8bd61f-4965-4410-9ec7-b858a4529287-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.233631 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.241709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-webhook-cert\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.246064 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-apiservice-cert\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.253201 4885 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.260361 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-images\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.272889 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.293428 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.307346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/790c2bc5-e8b1-4943-affd-360042eb1a79-proxy-tls\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.313570 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.316200 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.317429 4885 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.817392846 +0000 UTC m=+218.213446879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.317568 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.318143 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.818118545 +0000 UTC m=+218.214172608 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.326836 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfbc2e1-ae98-4c40-a739-877e7296f16a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.350734 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d98wm\" (UniqueName: \"kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.372571 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.373098 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwcst\" (UniqueName: \"kubernetes.io/projected/175c50f5-857d-4697-bcde-2ce47f2edfc5-kube-api-access-gwcst\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:16 crc 
kubenswrapper[4885]: I0308 19:35:16.392812 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.395349 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.405192 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-serving-cert\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.406980 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.414038 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.419467 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.419834 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.919803021 +0000 UTC m=+218.315857104 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.420175 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.421073 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.921036484 +0000 UTC m=+218.317090587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.433130 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.440126 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-config\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.454711 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.473257 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.490818 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-key\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.494015 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 08 19:35:16 crc 
kubenswrapper[4885]: I0308 19:35:16.513217 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.524317 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.524558 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.024516858 +0000 UTC m=+218.420570921 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.525465 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.526013 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.025996288 +0000 UTC m=+218.422050351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.538083 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.544982 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-cabundle\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.554087 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.566109 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-srv-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.573754 4885 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.594521 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.612885 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.627446 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.628657 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.128484585 +0000 UTC m=+218.524538648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.629176 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.630101 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.130078007 +0000 UTC m=+218.526132060 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.634404 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.641262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.662706 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.671008 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.673329 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.693253 4885 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.700606 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cpx85"] Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.701335 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: W0308 19:35:16.703159 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175c50f5_857d_4697_bcde_2ce47f2edfc5.slice/crio-03a3f4e5495e5af6d028f06da63e60eb1a2a3f162d95317269693c9bf7b89b7c WatchSource:0}: Error finding container 03a3f4e5495e5af6d028f06da63e60eb1a2a3f162d95317269693c9bf7b89b7c: Status 404 returned error can't find the container with id 03a3f4e5495e5af6d028f06da63e60eb1a2a3f162d95317269693c9bf7b89b7c Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.704500 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"] Mar 08 19:35:16 crc kubenswrapper[4885]: W0308 19:35:16.712284 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c7583a8_a980_4ab2_a594_bf55ec72c91c.slice/crio-4a34d899e46ec2504a4c18a48ad21ba4015069c3cce16e7b4c57e8e3b93a6b39 WatchSource:0}: Error finding container 4a34d899e46ec2504a4c18a48ad21ba4015069c3cce16e7b4c57e8e3b93a6b39: Status 404 returned error can't find the container with id 4a34d899e46ec2504a4c18a48ad21ba4015069c3cce16e7b4c57e8e3b93a6b39 Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.713540 
4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.734432 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.734695 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.234642261 +0000 UTC m=+218.630696284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.735798 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.738082 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.238057703 +0000 UTC m=+218.634111796 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.738121 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.753216 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.773724 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.784116 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f932056-01e3-43aa-a91a-7f33d20445ba-config-volume\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.793150 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.813694 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.825016 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f932056-01e3-43aa-a91a-7f33d20445ba-metrics-tls\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.833875 4885 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.841574 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.841717 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.341683851 +0000 UTC m=+218.737737874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.842471 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.842836 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.34282802 +0000 UTC m=+218.738882043 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.854103 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.872801 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.894166 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.913118 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.926074 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-node-bootstrap-token\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.933466 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.939331 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-certs\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.944081 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.944267 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.44424822 +0000 UTC m=+218.840302243 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.944513 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.945011 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.445000899 +0000 UTC m=+218.841054922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.953463 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.972872 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.984591 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a7420ef-f20d-4d48-a619-627327de2063-cert\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.993508 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.013553 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.047201 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.047365 
4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.547328092 +0000 UTC m=+218.943382155 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.048322 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.049260 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.549170362 +0000 UTC m=+218.945224395 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.058295 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8rh9\" (UniqueName: \"kubernetes.io/projected/df78e76d-5024-4d31-a0b9-17d0d6c6c258-kube-api-access-w8rh9\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.071860 4885 request.go:700] Waited for 1.895591995s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/serviceaccounts/oauth-apiserver-sa/token Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.077080 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc92s\" (UniqueName: \"kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.097904 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kc2g\" (UniqueName: \"kubernetes.io/projected/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-kube-api-access-8kc2g\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" 
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.109358 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvxds\" (UniqueName: \"kubernetes.io/projected/f5b425d2-db8e-45f3-a141-8ac7bd678491-kube-api-access-xvxds\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.119840 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.131042 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxk9p\" (UniqueName: \"kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.137239 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.152383 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.152620 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 19:35:17.652581845 +0000 UTC m=+219.048635898 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.153193 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.153812 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.653790477 +0000 UTC m=+219.049844530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.178838 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5jp7\" (UniqueName: \"kubernetes.io/projected/75c67b6f-14bc-4d96-a6b6-ae020ace5353-kube-api-access-l5jp7\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.187878 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrkrt\" (UniqueName: \"kubernetes.io/projected/5a244e04-1aec-4355-89c5-794667b5969f-kube-api-access-jrkrt\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.206883 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf8hq\" (UniqueName: \"kubernetes.io/projected/5ac2fbf9-c9bb-4ef8-988f-4407e688ad54-kube-api-access-xf8hq\") pod \"downloads-7954f5f757-t2b7w\" (UID: \"5ac2fbf9-c9bb-4ef8-988f-4407e688ad54\") " pod="openshift-console/downloads-7954f5f757-t2b7w" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.210624 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-t2b7w" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.213855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbpxs\" (UniqueName: \"kubernetes.io/projected/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-kube-api-access-gbpxs\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.231388 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkm6d\" (UniqueName: \"kubernetes.io/projected/9878f6fd-fc1f-4980-a687-84478d0b92c1-kube-api-access-pkm6d\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.241829 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.254557 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-447lv\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-kube-api-access-447lv\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.256304 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.257066 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.757043844 +0000 UTC m=+219.153097877 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.276260 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.302277 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b4zh\" (UniqueName: \"kubernetes.io/projected/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-kube-api-access-2b4zh\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.316775 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.330702 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.331735 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.349041 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.350691 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.360266 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.360704 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.860690293 +0000 UTC m=+219.256744326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.367988 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfrhm\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.371902 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.398304 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pld28\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-kube-api-access-pld28\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.411090 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q8q8m"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.412077 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.413889 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ce8edea-f754-4ee2-a475-2022f99ed7f9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.427211 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb554\" (UniqueName: \"kubernetes.io/projected/f2a6bad6-cd1e-4e38-88fe-d531ea458683-kube-api-access-jb554\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.448196 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jplnv\" (UniqueName: \"kubernetes.io/projected/a93ee425-a2b2-492c-bafc-2443d2fde2d4-kube-api-access-jplnv\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.460835 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.461013 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.960988012 +0000 UTC m=+219.357042035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.461173 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.461558 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.961548826 +0000 UTC m=+219.357602849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.471131 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hsdmw"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.475539 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngzsj\" (UniqueName: \"kubernetes.io/projected/e58e5e9a-de88-4209-8100-e9d4e415e68d-kube-api-access-ngzsj\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.484108 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.503101 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rjfm\" (UniqueName: \"kubernetes.io/projected/495803ea-175c-4ad0-ac77-0598ce8213c1-kube-api-access-7rjfm\") pod \"dns-operator-744455d44c-bzvjp\" (UID: \"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.503538 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.504667 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t2b7w"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.514860 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4fv6\" (UniqueName: \"kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:17 crc kubenswrapper[4885]: W0308 19:35:17.518885 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdbb4f97_c9c8_43ef_a4b1_06dea8d6d8b9.slice/crio-3f15d4fa347fa026085bfce2e909833cf87859b7db626e5ef40b0541cc513c59 WatchSource:0}: Error finding container 3f15d4fa347fa026085bfce2e909833cf87859b7db626e5ef40b0541cc513c59: Status 404 returned error can't find the container with id 3f15d4fa347fa026085bfce2e909833cf87859b7db626e5ef40b0541cc513c59 Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.523119 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:17 crc kubenswrapper[4885]: W0308 19:35:17.534276 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ac2fbf9_c9bb_4ef8_988f_4407e688ad54.slice/crio-1b9f9770f747e4ea9de3b28aa6bfcfef1e38616b82e25b5011012445d9dbb83d WatchSource:0}: Error finding container 1b9f9770f747e4ea9de3b28aa6bfcfef1e38616b82e25b5011012445d9dbb83d: Status 404 returned error can't find the container with id 1b9f9770f747e4ea9de3b28aa6bfcfef1e38616b82e25b5011012445d9dbb83d Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.538463 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.543442 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5664b98a-83b1-433d-8449-04a982f77fff-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.552481 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.553405 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fq2fp"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.558750 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.560394 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjd6\" (UniqueName: \"kubernetes.io/projected/fc52227b-0572-4fed-a5c1-e86521a20e58-kube-api-access-9jjd6\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.562523 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.563563 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.563592 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.063571531 +0000 UTC m=+219.459625554 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.563807 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" event={"ID":"75c67b6f-14bc-4d96-a6b6-ae020ace5353","Type":"ContainerStarted","Data":"dca6385e3ff6d75cd66d79ce65f428f2299199dccdcceb22795c5b4d923f14af"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.566866 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" event={"ID":"175c50f5-857d-4697-bcde-2ce47f2edfc5","Type":"ContainerStarted","Data":"7c53dc93eac2dc1186590567f3f6247b507dc5b2be999e78057b8c12727c7ec5"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.566962 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" event={"ID":"175c50f5-857d-4697-bcde-2ce47f2edfc5","Type":"ContainerStarted","Data":"8442bacc49b0d28ad2d61c70a11b459dc87988fb6032f517c9384c95838ab3f8"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.566993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" event={"ID":"175c50f5-857d-4697-bcde-2ce47f2edfc5","Type":"ContainerStarted","Data":"03a3f4e5495e5af6d028f06da63e60eb1a2a3f162d95317269693c9bf7b89b7c"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.576711 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g597v\" (UniqueName: 
\"kubernetes.io/projected/6ca246d9-b15a-4163-87dc-84b8bc916c4d-kube-api-access-g597v\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.577011 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" event={"ID":"4c7583a8-a980-4ab2-a594-bf55ec72c91c","Type":"ContainerStarted","Data":"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.577051 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" event={"ID":"4c7583a8-a980-4ab2-a594-bf55ec72c91c","Type":"ContainerStarted","Data":"4a34d899e46ec2504a4c18a48ad21ba4015069c3cce16e7b4c57e8e3b93a6b39"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.577225 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.578649 4885 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6lcrf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.578688 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.581528 4885 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" event={"ID":"df78e76d-5024-4d31-a0b9-17d0d6c6c258","Type":"ContainerStarted","Data":"888bff3f11c36a2990ba40c2b1027e5a0de19de32078e51bf9bb026c6597982a"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.584022 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hsdmw" event={"ID":"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9","Type":"ContainerStarted","Data":"3f15d4fa347fa026085bfce2e909833cf87859b7db626e5ef40b0541cc513c59"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.590431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jf4\" (UniqueName: \"kubernetes.io/projected/89561acc-f596-4f61-95b9-0cbc686a0b47-kube-api-access-92jf4\") pod \"migrator-59844c95c7-qnq6k\" (UID: \"89561acc-f596-4f61-95b9-0cbc686a0b47\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.608996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msmd\" (UniqueName: \"kubernetes.io/projected/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-kube-api-access-5msmd\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.616032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t2b7w" event={"ID":"5ac2fbf9-c9bb-4ef8-988f-4407e688ad54","Type":"ContainerStarted","Data":"1b9f9770f747e4ea9de3b28aa6bfcfef1e38616b82e25b5011012445d9dbb83d"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.618612 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.629844 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqgg4\" (UniqueName: \"kubernetes.io/projected/fbfbc2e1-ae98-4c40-a739-877e7296f16a-kube-api-access-lqgg4\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.632999 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.649471 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.651227 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np254\" (UniqueName: \"kubernetes.io/projected/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-kube-api-access-np254\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.657350 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.671538 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a554818d-91a7-48e1-a5a7-5808a5240f3e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.673241 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.673603 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.173586451 +0000 UTC m=+219.569640474 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.693100 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph"]
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.693598 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.694578 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrsm5\" (UniqueName: \"kubernetes.io/projected/790c2bc5-e8b1-4943-affd-360042eb1a79-kube-api-access-rrsm5\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.698770 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.706534 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.711552 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb2k8\" (UniqueName: \"kubernetes.io/projected/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-kube-api-access-hb2k8\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.721313 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.734263 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjdvb\" (UniqueName: \"kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.747664 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v24rf\" (UniqueName: \"kubernetes.io/projected/3c8bd61f-4965-4410-9ec7-b858a4529287-kube-api-access-v24rf\") pod \"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.756196 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.757100 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2tz9t"]
Mar 08 19:35:17 crc kubenswrapper[4885]: W0308 19:35:17.761624 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode58e5e9a_de88_4209_8100_e9d4e415e68d.slice/crio-287c51e206e17f70938901810509dcd30ad1e8cf6e7f98fb4ab1340276721e91 WatchSource:0}: Error finding container 287c51e206e17f70938901810509dcd30ad1e8cf6e7f98fb4ab1340276721e91: Status 404 returned error can't find the container with id 287c51e206e17f70938901810509dcd30ad1e8cf6e7f98fb4ab1340276721e91
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.768017 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtctx\" (UniqueName: \"kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx\") pod \"auto-csr-approver-29549974-jjqkh\" (UID: \"60da1edb-8474-4368-a6ae-0bb2b1b7b845\") " pod="openshift-infra/auto-csr-approver-29549974-jjqkh"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.776698 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.777101 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.277083136 +0000 UTC m=+219.673137159 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.798696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp85f\" (UniqueName: \"kubernetes.io/projected/6f932056-01e3-43aa-a91a-7f33d20445ba-kube-api-access-pp85f\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.821567 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b"]
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.830284 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q64h9\" (UniqueName: \"kubernetes.io/projected/ca80eb80-6964-436a-bf66-0c5fe9b7e641-kube-api-access-q64h9\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.831950 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"]
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.832637 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn"]
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.840535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rvsd\" (UniqueName: \"kubernetes.io/projected/0a7420ef-f20d-4d48-a619-627327de2063-kube-api-access-9rvsd\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.840739 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8"]
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.851165 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6drnf\" (UniqueName: \"kubernetes.io/projected/753974fb-c7b2-4e2b-a62d-22544f357c9b-kube-api-access-6drnf\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.869099 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.870003 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqf5f\" (UniqueName: \"kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.876055 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.877958 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.878321 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.378310079 +0000 UTC m=+219.774364102 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.905135 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s7wt\" (UniqueName: \"kubernetes.io/projected/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-kube-api-access-4s7wt\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.929664 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngwcn\" (UniqueName: \"kubernetes.io/projected/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-kube-api-access-ngwcn\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.930143 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.931545 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"]
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.934965 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwqcf\" (UniqueName: \"kubernetes.io/projected/494bb437-45dd-48e3-b932-9c3645e493ef-kube-api-access-fwqcf\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.937895 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.944418 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.963070 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.968763 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt"]
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.969686 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm"]
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.973695 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.978188 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w"
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.979464 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.979648 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.479621074 +0000 UTC m=+219.875675097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.980105 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.980416 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.480403815 +0000 UTC m=+219.876457838 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.982749 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q9q8c"]
Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.995264 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5"
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.001208 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7"
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.015176 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz"
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.035202 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549974-jjqkh"
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.043188 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s5pkw"
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.051790 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k"]
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.070251 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bl88k"
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.074231 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jbwsr"
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.081419 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.081561 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.581540296 +0000 UTC m=+219.977594309 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.081726 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.082111 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.582096762 +0000 UTC m=+219.978150855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:18 crc kubenswrapper[4885]: W0308 19:35:18.168142 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2a6bad6_cd1e_4e38_88fe_d531ea458683.slice/crio-2b4b4d6c691c9bf385d1aca10675c3a0f6e1dce4c0d75c0a9518dc576f033a3a WatchSource:0}: Error finding container 2b4b4d6c691c9bf385d1aca10675c3a0f6e1dce4c0d75c0a9518dc576f033a3a: Status 404 returned error can't find the container with id 2b4b4d6c691c9bf385d1aca10675c3a0f6e1dce4c0d75c0a9518dc576f033a3a
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.182644 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.182964 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.682812811 +0000 UTC m=+220.078866834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.183298 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.183621 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.683610153 +0000 UTC m=+220.079664176 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.284312 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.284673 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.784658091 +0000 UTC m=+220.180712114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.385871 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.386224 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.886213144 +0000 UTC m=+220.282267167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.487713 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.488065 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.988036333 +0000 UTC m=+220.384090346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.488250 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.488652 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.98864458 +0000 UTC m=+220.384698603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.589098 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.589461 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.089446112 +0000 UTC m=+220.485500135 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.683269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" event={"ID":"5a244e04-1aec-4355-89c5-794667b5969f","Type":"ContainerStarted","Data":"9be25d7733e293f02f15db81210b4d6937d933a0f0c84184e04a7e5621c56df8"}
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.700957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.701374 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.201360012 +0000 UTC m=+220.597414035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.721073 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9"]
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.750236 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" podStartSLOduration=183.750220312 podStartE2EDuration="3m3.750220312s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:18.749206184 +0000 UTC m=+220.145260207" watchObservedRunningTime="2026-03-08 19:35:18.750220312 +0000 UTC m=+220.146274335"
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.755793 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzvjp"]
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.762029 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"]
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.762054 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl"]
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.789626 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r"]
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.789669 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"]
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.795248 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" event={"ID":"9878f6fd-fc1f-4980-a687-84478d0b92c1","Type":"ContainerStarted","Data":"91d1f1a1e8874b5d7a868e5fe1a550f38d03387f0601c1f60d2396a313caae96"}
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.800268 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" event={"ID":"f5b425d2-db8e-45f3-a141-8ac7bd678491","Type":"ContainerStarted","Data":"6da66b4547f5616c03795376a3643219df504dc7a6844e33d368b86daeb04269"}
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.801825 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.803940 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.303904191 +0000 UTC m=+220.699958214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.815285 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hsdmw" event={"ID":"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9","Type":"ContainerStarted","Data":"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2"}
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.823325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" event={"ID":"46d0f7c6-3622-4e8a-885a-8f85ac63c36f","Type":"ContainerStarted","Data":"a75c342d6c7128f5cfc7ba5e493b603a907baec57caa4dec6a6301b3aa095a95"}
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.849701 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" event={"ID":"df78e76d-5024-4d31-a0b9-17d0d6c6c258","Type":"ContainerStarted","Data":"726d5822e3dcd21bba0f1b655f119ee9a19a61beb8108cae545b921c5f421cd3"}
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.854646 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" event={"ID":"a93ee425-a2b2-492c-bafc-2443d2fde2d4","Type":"ContainerStarted","Data":"af8ada1c83d377a083cd80c83ba67ca575cd054d01c6637a2e87cb70a37f07ea"}
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.867099 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" event={"ID":"75c67b6f-14bc-4d96-a6b6-ae020ace5353","Type":"ContainerStarted","Data":"6ebe829d87ca90bab0852d90ee536bd6df6c7241943f58baf5f947d0c1a63cfd"}
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.875524 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" event={"ID":"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71","Type":"ContainerStarted","Data":"5a1fd3ef96a73a5fc6f9284be43b3b9835d635e847a666b7836e4da30366a98d"}
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.891303 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t2b7w" event={"ID":"5ac2fbf9-c9bb-4ef8-988f-4407e688ad54","Type":"ContainerStarted","Data":"5c7f415af0a310e0e5ec815fb8b99f9af5a695746f634a306a170ddae1dc5ab1"}
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.893218 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-t2b7w"
Mar 08 19:35:18 crc kubenswrapper[4885]: W0308 19:35:18.894990 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca80eb80_6964_436a_bf66_0c5fe9b7e641.slice/crio-e36aa307efc224d05755f51ae7a1ba720dd78d7e212ea7c988fb34b75e8fc132 WatchSource:0}: Error finding container e36aa307efc224d05755f51ae7a1ba720dd78d7e212ea7c988fb34b75e8fc132: Status 404 returned error can't find the container with id e36aa307efc224d05755f51ae7a1ba720dd78d7e212ea7c988fb34b75e8fc132
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.896080 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" event={"ID":"f2a6bad6-cd1e-4e38-88fe-d531ea458683","Type":"ContainerStarted","Data":"2b4b4d6c691c9bf385d1aca10675c3a0f6e1dce4c0d75c0a9518dc576f033a3a"}
Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.904870 4885 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.907337 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.407322593 +0000 UTC m=+220.803376616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.922387 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" event={"ID":"7ce8edea-f754-4ee2-a475-2022f99ed7f9","Type":"ContainerStarted","Data":"88f02e527e05546477939405952a27f505b7ed5c0e9bb3144acc4432372dbfac"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.932476 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" event={"ID":"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0","Type":"ContainerStarted","Data":"40c776312848989ccb2eddcff1adbedd8af2200ce48ffdc1e4635ef09792719e"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.932806 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.943585 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" event={"ID":"6396835c-4d1e-4b5d-a6f1-4f8003f073e9","Type":"ContainerStarted","Data":"2f12d83b6e520da0ff0a1f7171d278c8ae9e7493aa1a2f58395b447dfa4e0de4"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.944178 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.946374 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lvfcn" event={"ID":"e58e5e9a-de88-4209-8100-e9d4e415e68d","Type":"ContainerStarted","Data":"287c51e206e17f70938901810509dcd30ad1e8cf6e7f98fb4ab1340276721e91"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.950892 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" event={"ID":"89561acc-f596-4f61-95b9-0cbc686a0b47","Type":"ContainerStarted","Data":"9e732dd141ffc6d56acf5e0362491a88c049308d550bfef78ae76c517eb37aa4"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.985653 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" podStartSLOduration=182.985635012 podStartE2EDuration="3m2.985635012s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:18.984229235 +0000 UTC m=+220.380283278" watchObservedRunningTime="2026-03-08 19:35:18.985635012 +0000 UTC m=+220.381689035" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.005627 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.007640 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.507618992 +0000 UTC m=+220.903673005 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054462 4885 patch_prober.go:28] interesting pod/console-operator-58897d9998-2tz9t container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054509 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" podUID="6396835c-4d1e-4b5d-a6f1-4f8003f073e9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054787 4885 
patch_prober.go:28] interesting pod/downloads-7954f5f757-t2b7w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054841 4885 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bg5wl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054850 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2b7w" podUID="5ac2fbf9-c9bb-4ef8-988f-4407e688ad54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054863 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.063588 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.110139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: 
\"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.113682 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.613659634 +0000 UTC m=+221.009713657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.214279 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.215322 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.715305349 +0000 UTC m=+221.111359372 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.316584 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.316860 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.816848511 +0000 UTC m=+221.212902534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.417562 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.417932 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.91790127 +0000 UTC m=+221.313955293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.519118 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.519769 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.01975295 +0000 UTC m=+221.415806973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.541709 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tw9z2"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.554271 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.566815 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.591520 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.599880 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.599938 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 
19:35:19.623959 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.624405 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.124390525 +0000 UTC m=+221.520444538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.683438 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" podStartSLOduration=183.683405317 podStartE2EDuration="3m3.683405317s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.680847268 +0000 UTC m=+221.076901291" watchObservedRunningTime="2026-03-08 19:35:19.683405317 +0000 UTC m=+221.079459340" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.715427 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-t2b7w" 
podStartSLOduration=184.715407805 podStartE2EDuration="3m4.715407805s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.714788809 +0000 UTC m=+221.110842832" watchObservedRunningTime="2026-03-08 19:35:19.715407805 +0000 UTC m=+221.111461828" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.725016 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.725294 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.22528296 +0000 UTC m=+221.621336983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.728196 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.730606 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.782089 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hsdmw" podStartSLOduration=184.782067212 podStartE2EDuration="3m4.782067212s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.761027788 +0000 UTC m=+221.157081811" watchObservedRunningTime="2026-03-08 19:35:19.782067212 +0000 UTC m=+221.178121225" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.826528 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.827961 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.32790195 +0000 UTC m=+221.723955973 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.828525 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.858440 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" podStartSLOduration=184.858423879 podStartE2EDuration="3m4.858423879s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.856087877 +0000 UTC m=+221.252141890" watchObservedRunningTime="2026-03-08 19:35:19.858423879 +0000 UTC m=+221.254477902" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.875299 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5ghwr"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.884703 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-lvfcn" podStartSLOduration=183.884688673 podStartE2EDuration="3m3.884688673s" podCreationTimestamp="2026-03-08 
19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.881734474 +0000 UTC m=+221.277788497" watchObservedRunningTime="2026-03-08 19:35:19.884688673 +0000 UTC m=+221.280742696" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.912561 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" podStartSLOduration=184.91254454 podStartE2EDuration="3m4.91254454s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.91034708 +0000 UTC m=+221.306401103" watchObservedRunningTime="2026-03-08 19:35:19.91254454 +0000 UTC m=+221.308598563" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.929269 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.929752 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.42973612 +0000 UTC m=+221.825790143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.961510 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w"]
Mar 08 19:35:19 crc kubenswrapper[4885]: W0308 19:35:19.967509 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc52227b_0572_4fed_a5c1_e86521a20e58.slice/crio-d6324ba7e136849d35c67b5f0bd01bf24c3a978e92e681f166a89fef91a40b61 WatchSource:0}: Error finding container d6324ba7e136849d35c67b5f0bd01bf24c3a978e92e681f166a89fef91a40b61: Status 404 returned error can't find the container with id d6324ba7e136849d35c67b5f0bd01bf24c3a978e92e681f166a89fef91a40b61
Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.970533 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549974-jjqkh"]
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.010600 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jbwsr"]
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.022203 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" event={"ID":"1329795d-a8f9-4896-adba-23c2c0da9261","Type":"ContainerStarted","Data":"134df6f2646bd48f24f5f5499ba5d32597cef8343a7f0bc84c8f90089d6df4b2"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.030946 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.031308 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.531285723 +0000 UTC m=+221.927339746 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.032230 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" event={"ID":"75c67b6f-14bc-4d96-a6b6-ae020ace5353","Type":"ContainerStarted","Data":"0f8119b0d68e82807f3907440f0ab6a1dd0c951f0bc34c5ea8883d882aa8768e"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.035472 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" event={"ID":"fbfbc2e1-ae98-4c40-a739-877e7296f16a","Type":"ContainerStarted","Data":"24fa952afc1cee799dbe57d90ca6a31ef015131f7bcb28a28e0cae5629e156af"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.044124 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" event={"ID":"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0","Type":"ContainerStarted","Data":"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.053811 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" event={"ID":"89561acc-f596-4f61-95b9-0cbc686a0b47","Type":"ContainerStarted","Data":"822026861bd7bd3eb3b4abc852e4a8b2ee41fbc101e83faae4ea4adda87b4aba"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.058191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" event={"ID":"fe3a8c81-8c1d-4b38-9cae-813fb749fd43","Type":"ContainerStarted","Data":"22c1696bfe5d42dda6bd4ec2ff027c09be34e25612f324559a931a48e4fe7037"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.066760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" event={"ID":"4928f728-c20b-4d8e-83f3-786cf90cf3e6","Type":"ContainerStarted","Data":"5c303026128bc9a23d184cd4393e6b606778e0a778c0e10cfe8f03f268ec44d2"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.084754 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" event={"ID":"7ce8edea-f754-4ee2-a475-2022f99ed7f9","Type":"ContainerStarted","Data":"f66c621893c1804a2ce9dd2c9e30230e8c52596e34c2e42465a9285e6768bcf2"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.086245 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" event={"ID":"9878f6fd-fc1f-4980-a687-84478d0b92c1","Type":"ContainerStarted","Data":"e0cadddc4c54fc080453871aee1452224b0f7cf0ac2e114f1399e3a163a6efb0"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.092151 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" event={"ID":"6396835c-4d1e-4b5d-a6f1-4f8003f073e9","Type":"ContainerStarted","Data":"9ca9706ffcf72c3c73f90c5b8b579cbad436f32afa85e40bad562c9ecb6198c7"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.093195 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" event={"ID":"5664b98a-83b1-433d-8449-04a982f77fff","Type":"ContainerStarted","Data":"5f22ebcb294fc868cd9864233184817834561a14db9eecdc3410944896bd4b06"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.094120 4885 patch_prober.go:28] interesting pod/console-operator-58897d9998-2tz9t container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.094169 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" podUID="6396835c-4d1e-4b5d-a6f1-4f8003f073e9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.095172 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" event={"ID":"494bb437-45dd-48e3-b932-9c3645e493ef","Type":"ContainerStarted","Data":"f659fc375e81a0da3b3a8a5764dac9643b37e01b75000e4baf0c2c826224551b"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.103263 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hqtdl"]
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.105263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" event={"ID":"46d0f7c6-3622-4e8a-885a-8f85ac63c36f","Type":"ContainerStarted","Data":"cfbedb223e1048892c1331c9f68c07cd8c7ed4be2b1b10fb1fd66bb675ec0496"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.116056 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr"]
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.126147 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.136869 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.138270 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.63825871 +0000 UTC m=+222.034312733 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.144283 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bl88k" event={"ID":"ca80eb80-6964-436a-bf66-0c5fe9b7e641","Type":"ContainerStarted","Data":"71a55f6115fed54a842d0d8a7fe91a125603a769193f94ec11703612038a1772"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.144318 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bl88k" event={"ID":"ca80eb80-6964-436a-bf66-0c5fe9b7e641","Type":"ContainerStarted","Data":"e36aa307efc224d05755f51ae7a1ba720dd78d7e212ea7c988fb34b75e8fc132"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.154550 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7"]
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.157596 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" podStartSLOduration=184.157580788 podStartE2EDuration="3m4.157580788s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.154887826 +0000 UTC m=+221.550941859" watchObservedRunningTime="2026-03-08 19:35:20.157580788 +0000 UTC m=+221.553634801"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.164097 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s5pkw"]
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.186412 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" event={"ID":"6ca246d9-b15a-4163-87dc-84b8bc916c4d","Type":"ContainerStarted","Data":"5984b65b91ed88837ac3d747bb2d9735e94039292fdd6931b4edb014e97f896d"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.186452 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" event={"ID":"6ca246d9-b15a-4163-87dc-84b8bc916c4d","Type":"ContainerStarted","Data":"7e0d027df693db86a17d40da3dc20a54596865a69eda634418b82dad66b0d64b"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.189416 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" podStartSLOduration=184.189406452 podStartE2EDuration="3m4.189406452s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.185527768 +0000 UTC m=+221.581581791" watchObservedRunningTime="2026-03-08 19:35:20.189406452 +0000 UTC m=+221.585460475"
Mar 08 19:35:20 crc kubenswrapper[4885]: W0308 19:35:20.200505 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c8bd61f_4965_4410_9ec7_b858a4529287.slice/crio-5b8dbe15e26c016a36c27b2e9fe6adb02153e3419b63e7ccc486e1c23dd52fb8 WatchSource:0}: Error finding container 5b8dbe15e26c016a36c27b2e9fe6adb02153e3419b63e7ccc486e1c23dd52fb8: Status 404 returned error can't find the container with id 5b8dbe15e26c016a36c27b2e9fe6adb02153e3419b63e7ccc486e1c23dd52fb8
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.204746 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" event={"ID":"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1","Type":"ContainerStarted","Data":"88c0b1af50a552d08be715255e8619226fa4c9e630976040932d73e5d1b633eb"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.228068 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5"]
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.236979 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" event={"ID":"495803ea-175c-4ad0-ac77-0598ce8213c1","Type":"ContainerStarted","Data":"c605a611d56f05303c2dcd605e19bb8a2402ffc1ea2399ec11f51363586ef9b5"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.238147 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" podStartSLOduration=185.238136697 podStartE2EDuration="3m5.238136697s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.237321726 +0000 UTC m=+221.633375749" watchObservedRunningTime="2026-03-08 19:35:20.238136697 +0000 UTC m=+221.634190720"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.239393 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.240393 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.740371718 +0000 UTC m=+222.136425741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.276277 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" podStartSLOduration=184.271412039 podStartE2EDuration="3m4.271412039s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.269652713 +0000 UTC m=+221.665706736" watchObservedRunningTime="2026-03-08 19:35:20.271412039 +0000 UTC m=+221.667466062"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.281856 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"]
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.290220 4885 generic.go:334] "Generic (PLEG): container finished" podID="f5b425d2-db8e-45f3-a141-8ac7bd678491" containerID="cbe27be8b7dab43238c6046bdd1ad6b778c046687071c13fbcbbacadf7286eb5" exitCode=0
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.291793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" event={"ID":"f5b425d2-db8e-45f3-a141-8ac7bd678491","Type":"ContainerDied","Data":"cbe27be8b7dab43238c6046bdd1ad6b778c046687071c13fbcbbacadf7286eb5"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.297809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" event={"ID":"f2a6bad6-cd1e-4e38-88fe-d531ea458683","Type":"ContainerStarted","Data":"9600dc055804a3c36d351eb86512c92ddfcc39baafd43cbd5a93f262b40e898e"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.321289 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" podStartSLOduration=185.321267856 podStartE2EDuration="3m5.321267856s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.320044504 +0000 UTC m=+221.716098527" watchObservedRunningTime="2026-03-08 19:35:20.321267856 +0000 UTC m=+221.717321879"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.341837 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" event={"ID":"0bc91e4d-f2d9-494a-bca6-4a55cc82823b","Type":"ContainerStarted","Data":"50bf4da5bc456db0d20e51c442b88818b7facd5dec864ab7655b9c4bcb8bd792"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.342118 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.342895 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.842877445 +0000 UTC m=+222.238931468 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.372970 4885 generic.go:334] "Generic (PLEG): container finished" podID="5a244e04-1aec-4355-89c5-794667b5969f" containerID="cd28bccff11894cfc898016d0e9636b64db0338a86899e48f70c2cc6d9936340" exitCode=0
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.373246 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" event={"ID":"5a244e04-1aec-4355-89c5-794667b5969f","Type":"ContainerDied","Data":"cd28bccff11894cfc898016d0e9636b64db0338a86899e48f70c2cc6d9936340"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.379381 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.381830 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-bl88k" podStartSLOduration=6.381810569 podStartE2EDuration="6.381810569s" podCreationTimestamp="2026-03-08 19:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.360438086 +0000 UTC m=+221.756492109" watchObservedRunningTime="2026-03-08 19:35:20.381810569 +0000 UTC m=+221.777864592"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.405241 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" podStartSLOduration=185.405220847 podStartE2EDuration="3m5.405220847s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.400901461 +0000 UTC m=+221.796955484" watchObservedRunningTime="2026-03-08 19:35:20.405220847 +0000 UTC m=+221.801274870"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.382257 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" event={"ID":"a554818d-91a7-48e1-a5a7-5808a5240f3e","Type":"ContainerStarted","Data":"5c6b2740317e9c99c70a069e1107a4823751c8d4119f1015bb56c43ea6ad6dba"}
Mar 08 19:35:20 crc kubenswrapper[4885]: W0308 19:35:20.399668 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f932056_01e3_43aa_a91a_7f33d20445ba.slice/crio-54af98463d5f942570d4ff3ed6c7eea66416d1158fc848c328eb506d2c95787d WatchSource:0}: Error finding container 54af98463d5f942570d4ff3ed6c7eea66416d1158fc848c328eb506d2c95787d: Status 404 returned error can't find the container with id 54af98463d5f942570d4ff3ed6c7eea66416d1158fc848c328eb506d2c95787d
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.419851 4885 generic.go:334] "Generic (PLEG): container finished" podID="ace2a8fd-20b4-40b6-a2ce-3e34454b3c71" containerID="5726d4714ea936e1fe74b51d8e22ca7342cc7250bda63a0d262e70977c5d6314" exitCode=0
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.427304 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lvfcn" event={"ID":"e58e5e9a-de88-4209-8100-e9d4e415e68d","Type":"ContainerStarted","Data":"358af39687fbb3a6287d37b8924ed64ff5df895ca495d93d9dea0acb3df66e4c"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.427415 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" event={"ID":"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71","Type":"ContainerDied","Data":"5726d4714ea936e1fe74b51d8e22ca7342cc7250bda63a0d262e70977c5d6314"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.445464 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.446899 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.946884213 +0000 UTC m=+222.342938236 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.466546 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" event={"ID":"9930a19e-2aa9-42ec-91fc-16cd50bc2f40","Type":"ContainerStarted","Data":"bd37791fc087646bf00840284fb7bb455597a0b50d52b33de9e0532834430bed"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.495049 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" event={"ID":"d008be41-8eac-496a-9c3d-083014dc402c","Type":"ContainerStarted","Data":"f8be22b6ff0acffcb9ccf46c1c57e368d284120713d460abb8b81012323f6dcf"}
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.495089 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.496812 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2b7w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body=
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.496870 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2b7w" podUID="5ac2fbf9-c9bb-4ef8-988f-4407e688ad54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.500494 4885 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2bp7t container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.32:6443/healthz\": dial tcp 10.217.0.32:6443: connect: connection refused" start-of-body=
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.500558 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.32:6443/healthz\": dial tcp 10.217.0.32:6443: connect: connection refused"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.548169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.551553 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.051541609 +0000 UTC m=+222.447595632 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.575034 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 19:35:20 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld
Mar 08 19:35:20 crc kubenswrapper[4885]: [+]process-running ok
Mar 08 19:35:20 crc kubenswrapper[4885]: healthz check failed
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.575252 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.650484 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.650836 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.15082219 +0000 UTC m=+222.546876213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.654959 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" podStartSLOduration=185.65490339 podStartE2EDuration="3m5.65490339s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.602050133 +0000 UTC m=+221.998104156" watchObservedRunningTime="2026-03-08 19:35:20.65490339 +0000 UTC m=+222.050957413"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.655676 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" podStartSLOduration=184.655669561 podStartE2EDuration="3m4.655669561s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.651079727 +0000 UTC m=+222.047133750" watchObservedRunningTime="2026-03-08 19:35:20.655669561 +0000 UTC m=+222.051723594"
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.752691 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.753218 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.253205855 +0000 UTC m=+222.649259878 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.853675 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.854070 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.354056118 +0000 UTC m=+222.750110141 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.956527 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.956845 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.456834094 +0000 UTC m=+222.852888117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.059361 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.060033 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.560004689 +0000 UTC m=+222.956058712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.164859 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.165349 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.665335313 +0000 UTC m=+223.061389336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.268308 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.268479 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.768442307 +0000 UTC m=+223.164496330 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.269514 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.269864 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.769850024 +0000 UTC m=+223.165904047 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.364902 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59112: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.383306 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.383843 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.883828 +0000 UTC m=+223.279882013 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.461145 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59116: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.484997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.485321 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.98530997 +0000 UTC m=+223.381363993 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.528035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" event={"ID":"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1","Type":"ContainerStarted","Data":"2afac84c9994094dbcc536c4992e520d150cafe6ff99582d069b7e57a065ee18"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.528078 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" event={"ID":"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1","Type":"ContainerStarted","Data":"98b356fec6f5a3b07b7f8d550069bc888524e61649dfe9e4f3646cfc3746121b"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.556771 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" podStartSLOduration=186.556756515 podStartE2EDuration="3m6.556756515s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.554793143 +0000 UTC m=+222.950847166" watchObservedRunningTime="2026-03-08 19:35:21.556756515 +0000 UTC m=+222.952810538" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.561692 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:21 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:21 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:21 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.561744 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.574205 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" event={"ID":"89561acc-f596-4f61-95b9-0cbc686a0b47","Type":"ContainerStarted","Data":"ae3259feb4e05dab34376546fa71e134c7db4dd16752a41734b7be4914fcbf6e"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.586247 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.587016 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" event={"ID":"3c8bd61f-4965-4410-9ec7-b858a4529287","Type":"ContainerStarted","Data":"a018a113bf2c34ad790622b7e6a21b7df550841ddadd4ac0190d2fb27bfb0729"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.587058 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" 
event={"ID":"3c8bd61f-4965-4410-9ec7-b858a4529287","Type":"ContainerStarted","Data":"5b8dbe15e26c016a36c27b2e9fe6adb02153e3419b63e7ccc486e1c23dd52fb8"} Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.587161 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.08714484 +0000 UTC m=+223.483198863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.589369 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" event={"ID":"fc52227b-0572-4fed-a5c1-e86521a20e58","Type":"ContainerStarted","Data":"d2c8fcad6617015b58eacd10f15825587255865b024a8bd694205a7bb7310736"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.589406 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" event={"ID":"fc52227b-0572-4fed-a5c1-e86521a20e58","Type":"ContainerStarted","Data":"d6324ba7e136849d35c67b5f0bd01bf24c3a978e92e681f166a89fef91a40b61"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.592760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" event={"ID":"494bb437-45dd-48e3-b932-9c3645e493ef","Type":"ContainerStarted","Data":"061e204cf4b604d0aff77e0470b12de1bd86fd4d554678c8a8f235c4aec2236b"} Mar 08 
19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.593188 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.605211 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" event={"ID":"495803ea-175c-4ad0-ac77-0598ce8213c1","Type":"ContainerStarted","Data":"0236a696a6278a43e30ab9924a31d19da4cc309a43f9302effc299193e780900"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.611473 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" event={"ID":"a93ee425-a2b2-492c-bafc-2443d2fde2d4","Type":"ContainerStarted","Data":"8a7d4b715db9b88eb0212a5cb14539c74237db982628ac0efce6aff9d1e1558a"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.617227 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59118: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.619409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" event={"ID":"82ad04de-932b-4ebf-97cf-0a6344ee1a9e","Type":"ContainerStarted","Data":"78e9a56d72798428d2805265d6ca05b9c1b61dddabc9930818607bcb302bae04"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.619453 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" event={"ID":"82ad04de-932b-4ebf-97cf-0a6344ee1a9e","Type":"ContainerStarted","Data":"8e098fe83bb84f9477f906d20387594d444eacdb41e849a6af7be74a7aef91f8"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.620308 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.621506 4885 
patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m7ttr container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.621565 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" podUID="82ad04de-932b-4ebf-97cf-0a6344ee1a9e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.632340 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" event={"ID":"790c2bc5-e8b1-4943-affd-360042eb1a79","Type":"ContainerStarted","Data":"7610278a1d339674636069d701d3a641d470b7cda8e0f49ee642589c394b1a09"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.632374 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" event={"ID":"790c2bc5-e8b1-4943-affd-360042eb1a79","Type":"ContainerStarted","Data":"75d8a238176218ff48dca8d9d8d7315e732593335742237740bd86a1389d3f47"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.632384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" event={"ID":"790c2bc5-e8b1-4943-affd-360042eb1a79","Type":"ContainerStarted","Data":"a82a9753b5a0ab0015b44e21844d5750af26eb5e84e8dedb6e6fa09124c3d003"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.653733 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" 
event={"ID":"4928f728-c20b-4d8e-83f3-786cf90cf3e6","Type":"ContainerStarted","Data":"251fc07b69057bce26461a677acd5d783780fe5344eef1cc3d7eeec2e562456b"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.659662 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" event={"ID":"d008be41-8eac-496a-9c3d-083014dc402c","Type":"ContainerStarted","Data":"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.661308 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" podStartSLOduration=185.661293468 podStartE2EDuration="3m5.661293468s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.631663974 +0000 UTC m=+223.027717997" watchObservedRunningTime="2026-03-08 19:35:21.661293468 +0000 UTC m=+223.057347491" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.661760 4885 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2bp7t container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.32:6443/healthz\": dial tcp 10.217.0.32:6443: connect: connection refused" start-of-body= Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.661813 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.32:6443/healthz\": dial tcp 10.217.0.32:6443: connect: connection refused" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.663718 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" podStartSLOduration=185.663713303 podStartE2EDuration="3m5.663713303s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.661169745 +0000 UTC m=+223.057223768" watchObservedRunningTime="2026-03-08 19:35:21.663713303 +0000 UTC m=+223.059767326" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.687847 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.689861 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.189843393 +0000 UTC m=+223.585897416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.700229 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" event={"ID":"a554818d-91a7-48e1-a5a7-5808a5240f3e","Type":"ContainerStarted","Data":"7a4e0d8b5dd04f3c2b9599f08805d9622c4ab1441d28e223e4066a951d90c221"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.736805 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" event={"ID":"5664b98a-83b1-433d-8449-04a982f77fff","Type":"ContainerStarted","Data":"addab8fadc44035b194d48f810c6120eb7ce9d3c5ffdb8a3fb6c3d2b4394803e"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.738188 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" podStartSLOduration=186.738178329 podStartE2EDuration="3m6.738178329s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.694026855 +0000 UTC m=+223.090080878" watchObservedRunningTime="2026-03-08 19:35:21.738178329 +0000 UTC m=+223.134232352" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.738481 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" 
podStartSLOduration=185.738476227 podStartE2EDuration="3m5.738476227s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.738012995 +0000 UTC m=+223.134067018" watchObservedRunningTime="2026-03-08 19:35:21.738476227 +0000 UTC m=+223.134530250" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.755191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" event={"ID":"60da1edb-8474-4368-a6ae-0bb2b1b7b845","Type":"ContainerStarted","Data":"37aa47577a13b111a43340382f6d3fe85ac7c15ce7d0d5052fe2dc63b6ee4d2c"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.769193 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59126: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.779711 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" podStartSLOduration=185.779696612 podStartE2EDuration="3m5.779696612s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.779355173 +0000 UTC m=+223.175409196" watchObservedRunningTime="2026-03-08 19:35:21.779696612 +0000 UTC m=+223.175750635" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.788908 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.790268 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.290253315 +0000 UTC m=+223.686307338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.802021 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" event={"ID":"9930a19e-2aa9-42ec-91fc-16cd50bc2f40","Type":"ContainerStarted","Data":"b68d8855c3d14efa923fb60bf2796a36fab89cd8d899314e2bd6c4567ba8f37c"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.819447 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" podStartSLOduration=185.819429147 podStartE2EDuration="3m5.819429147s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.807298952 +0000 UTC m=+223.203352975" watchObservedRunningTime="2026-03-08 19:35:21.819429147 +0000 UTC m=+223.215483170" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.837902 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" 
event={"ID":"6ca246d9-b15a-4163-87dc-84b8bc916c4d","Type":"ContainerStarted","Data":"c7312ccdce297bf1cb5832f17d211b176ed0a46e5e0bf56fb40e793e79e8e0bf"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.851106 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" podStartSLOduration=186.851087305 podStartE2EDuration="3m6.851087305s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.849044801 +0000 UTC m=+223.245098824" watchObservedRunningTime="2026-03-08 19:35:21.851087305 +0000 UTC m=+223.247141328" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.858174 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" event={"ID":"fbfbc2e1-ae98-4c40-a739-877e7296f16a","Type":"ContainerStarted","Data":"fc5985b1de4e52486230835005129f42595df4bf6093a086215a241cef3953f4"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.858221 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" event={"ID":"fbfbc2e1-ae98-4c40-a739-877e7296f16a","Type":"ContainerStarted","Data":"3fbd91dc6c3731018331d8714c4d928bf1e958d81c1651a9ed465e3be3a68538"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.858347 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59140: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.858789 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.887213 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" event={"ID":"0bc91e4d-f2d9-494a-bca6-4a55cc82823b","Type":"ContainerStarted","Data":"76bf8744bcfc46e603b924837b1d9d82677d928c50703a871352c5035b8bfe75"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.896859 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" podStartSLOduration=185.896844002 podStartE2EDuration="3m5.896844002s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.896323249 +0000 UTC m=+223.292377272" watchObservedRunningTime="2026-03-08 19:35:21.896844002 +0000 UTC m=+223.292898025" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.897779 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.899164 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" podStartSLOduration=185.899155285 podStartE2EDuration="3m5.899155285s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.874952935 +0000 UTC m=+223.271006958" watchObservedRunningTime="2026-03-08 19:35:21.899155285 +0000 UTC m=+223.295209308" Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.899991 4885 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.399974946 +0000 UTC m=+223.796028959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.912343 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" event={"ID":"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459","Type":"ContainerStarted","Data":"b829eaa1a4b9abcca4a77046f2fb98ecf6de87c37a123b4fd7929bc57ce5e728"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.912391 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" event={"ID":"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459","Type":"ContainerStarted","Data":"cc1de70dc4290eb9eb710be70cd796443996f1f016634cca43de0b717ab4d300"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.922783 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" event={"ID":"753974fb-c7b2-4e2b-a62d-22544f357c9b","Type":"ContainerStarted","Data":"e57db9bd3698de6976d16ae2777d5f41258da142e2fb951445691d6b5897d94b"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.922827 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" 
event={"ID":"753974fb-c7b2-4e2b-a62d-22544f357c9b","Type":"ContainerStarted","Data":"a36061103302c2542520327f3c17973f7b1bfc4519647805cca1f03fe28492d4"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.923646 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.924902 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" event={"ID":"1329795d-a8f9-4896-adba-23c2c0da9261","Type":"ContainerStarted","Data":"2dc1a91346ca1a3f953a586589c1cef9384b7780b322dbba277349a4d5f8d041"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.932017 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" podStartSLOduration=185.932002345 podStartE2EDuration="3m5.932002345s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.930384241 +0000 UTC m=+223.326438264" watchObservedRunningTime="2026-03-08 19:35:21.932002345 +0000 UTC m=+223.328056368" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.947032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" event={"ID":"46d0f7c6-3622-4e8a-885a-8f85ac63c36f","Type":"ContainerStarted","Data":"3f9619d94104fc326ea6c648cfb32059e55688e8bf9b2d4729e502fa84e1a431"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.954354 4885 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-h2bp5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection 
refused" start-of-body= Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.954422 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" podUID="753974fb-c7b2-4e2b-a62d-22544f357c9b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.965966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jbwsr" event={"ID":"0a7420ef-f20d-4d48-a619-627327de2063","Type":"ContainerStarted","Data":"e18311321d3c0014e1426b869312b2990c8d412d6413e3cd283d62871252ec66"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.966008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jbwsr" event={"ID":"0a7420ef-f20d-4d48-a619-627327de2063","Type":"ContainerStarted","Data":"ee1100e97dbf89d4e7b1ea5eca189e6341bfe3b6b9ce78930e988e8849b1cc6e"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.967873 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" event={"ID":"83de4c2d-767a-4635-8748-486dd45683a1","Type":"ContainerStarted","Data":"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.967891 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" event={"ID":"83de4c2d-767a-4635-8748-486dd45683a1","Type":"ContainerStarted","Data":"2ea70402b3dbdea12ac7aa07af023bd9134877c1cdbc64f413fbc103681b29c0"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.968473 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 
19:35:21.974065 4885 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ldvgz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.974104 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.979780 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59146: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.980435 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s5pkw" event={"ID":"6f932056-01e3-43aa-a91a-7f33d20445ba","Type":"ContainerStarted","Data":"54af98463d5f942570d4ff3ed6c7eea66416d1158fc848c328eb506d2c95787d"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.988848 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" event={"ID":"fe3a8c81-8c1d-4b38-9cae-813fb749fd43","Type":"ContainerStarted","Data":"408a83629bea15447a284e212f600dc5a3061c22a8484e4eabb1b71b7b670ef5"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.989688 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" podStartSLOduration=185.98966224 podStartE2EDuration="3m5.98966224s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-08 19:35:21.980184467 +0000 UTC m=+223.376238490" watchObservedRunningTime="2026-03-08 19:35:21.98966224 +0000 UTC m=+223.385716263" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.000687 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.001828 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.501813796 +0000 UTC m=+223.897867819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.034982 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2b7w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.035028 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2b7w" podUID="5ac2fbf9-c9bb-4ef8-988f-4407e688ad54" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.035501 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" event={"ID":"5a244e04-1aec-4355-89c5-794667b5969f","Type":"ContainerStarted","Data":"a7712e58e3657307c61d5c1c66c3ba86b19138bdd56e7c08c023008232e51c80"} Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.036493 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.049134 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" podStartSLOduration=186.049101004 podStartE2EDuration="3m6.049101004s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.048746335 +0000 UTC m=+223.444800348" watchObservedRunningTime="2026-03-08 19:35:22.049101004 +0000 UTC m=+223.445155027" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.051339 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" podStartSLOduration=186.051330293 podStartE2EDuration="3m6.051330293s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.019735487 +0000 UTC m=+223.415789500" watchObservedRunningTime="2026-03-08 19:35:22.051330293 +0000 UTC m=+223.447384316" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.066055 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.070389 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59156: no serving certificate available for the kubelet" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.075075 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jbwsr" podStartSLOduration=7.075059929 podStartE2EDuration="7.075059929s" podCreationTimestamp="2026-03-08 19:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.073908079 +0000 UTC m=+223.469962102" watchObservedRunningTime="2026-03-08 19:35:22.075059929 +0000 UTC m=+223.471113952" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.125191 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.128112 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" podStartSLOduration=186.128097171 podStartE2EDuration="3m6.128097171s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.124431914 +0000 UTC m=+223.520485937" watchObservedRunningTime="2026-03-08 19:35:22.128097171 +0000 UTC m=+223.524151204" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.132019 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.632002726 +0000 UTC m=+224.028056749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.156710 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" podStartSLOduration=187.156676748 podStartE2EDuration="3m7.156676748s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.155161597 +0000 UTC m=+223.551215620" watchObservedRunningTime="2026-03-08 19:35:22.156676748 +0000 UTC m=+223.552730771" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.200263 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59164: no serving certificate available for the kubelet" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.201088 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" podStartSLOduration=187.201071577 podStartE2EDuration="3m7.201071577s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.200630426 +0000 UTC 
m=+223.596684449" watchObservedRunningTime="2026-03-08 19:35:22.201071577 +0000 UTC m=+223.597125600" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.233421 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.233839 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.733823446 +0000 UTC m=+224.129877459 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.337640 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.338319 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.838307236 +0000 UTC m=+224.234361259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.439667 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.440066 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.940050694 +0000 UTC m=+224.336104717 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.541469 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.541843 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.041827602 +0000 UTC m=+224.437881625 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.556545 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:22 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:22 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:22 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.556601 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.593213 4885 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-j7xfr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.593486 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" podUID="494bb437-45dd-48e3-b932-9c3645e493ef" containerName="packageserver" probeResult="failure" output="Get 
\"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.642506 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.642637 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.142612074 +0000 UTC m=+224.538666097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.642803 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.643115 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.143100037 +0000 UTC m=+224.539154060 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.743747 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.743937 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.243898219 +0000 UTC m=+224.639952242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.744385 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.744697 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.24468917 +0000 UTC m=+224.640743193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.845686 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.845874 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.345851182 +0000 UTC m=+224.741905205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.846005 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.846284 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.346276343 +0000 UTC m=+224.742330366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.885004 4885 ???:1] "http: TLS handshake error from 192.168.126.11:39194: no serving certificate available for the kubelet" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.946893 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.947038 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.447011914 +0000 UTC m=+224.843065937 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.947095 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.947390 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.447382313 +0000 UTC m=+224.843436336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.032789 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" podStartSLOduration=187.032771062 podStartE2EDuration="3m7.032771062s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.295200111 +0000 UTC m=+223.691254134" watchObservedRunningTime="2026-03-08 19:35:23.032771062 +0000 UTC m=+224.428825085" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.032971 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"] Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.033178 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerName="controller-manager" containerID="cri-o://9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652" gracePeriod=30 Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.047692 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.048060 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.548041532 +0000 UTC m=+224.944095555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.049487 4885 generic.go:334] "Generic (PLEG): container finished" podID="1329795d-a8f9-4896-adba-23c2c0da9261" containerID="2dc1a91346ca1a3f953a586589c1cef9384b7780b322dbba277349a4d5f8d041" exitCode=0 Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.049635 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" event={"ID":"1329795d-a8f9-4896-adba-23c2c0da9261","Type":"ContainerDied","Data":"2dc1a91346ca1a3f953a586589c1cef9384b7780b322dbba277349a4d5f8d041"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.058938 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" event={"ID":"495803ea-175c-4ad0-ac77-0598ce8213c1","Type":"ContainerStarted","Data":"2c361cdfc1cff12d50ee9ee085504ab66dd55d8189793f12bcc353982149dab2"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.066614 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" 
event={"ID":"0bc91e4d-f2d9-494a-bca6-4a55cc82823b","Type":"ContainerStarted","Data":"ca39148a9626339987a5aa2ed03d067640ce3ea5cfb1c18cbede6b6f556b0d08"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.070423 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" event={"ID":"3c8bd61f-4965-4410-9ec7-b858a4529287","Type":"ContainerStarted","Data":"4c4e05bc564fcc66f0abbb235f48feb99eaafb896ab2420ec03c6029f7b0803c"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.081836 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" event={"ID":"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71","Type":"ContainerStarted","Data":"b41eebef7022481e1ffd80510952eacb0584b1209a68ef76a2ea9db2aa437f4b"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.088486 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s5pkw" event={"ID":"6f932056-01e3-43aa-a91a-7f33d20445ba","Type":"ContainerStarted","Data":"6092b260bdb36ebe73d894ceb7e9ad1d6a208e6765216dfd3aecb3017986764f"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.088524 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s5pkw" event={"ID":"6f932056-01e3-43aa-a91a-7f33d20445ba","Type":"ContainerStarted","Data":"598a74675b4618713ba1057715801f585ba07f1c27c5278513b4da0da2659659"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.088695 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.089200 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"] Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.113357 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" podStartSLOduration=187.113340653 podStartE2EDuration="3m7.113340653s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:23.113127747 +0000 UTC m=+224.509181770" watchObservedRunningTime="2026-03-08 19:35:23.113340653 +0000 UTC m=+224.509394676" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.117737 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" event={"ID":"f5b425d2-db8e-45f3-a141-8ac7bd678491","Type":"ContainerStarted","Data":"fb6386a3971604b1cda9ebf974a2be03222ed9f37a2e6f751f3c4d327cd559dc"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.117771 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" event={"ID":"f5b425d2-db8e-45f3-a141-8ac7bd678491","Type":"ContainerStarted","Data":"9c5b6434d50072b4bb4603a93d7f28a8de87f7df4b6ec699653d852a93d711a6"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.123086 4885 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ldvgz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.123143 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.138372 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.143210 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.146515 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.151260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.152302 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.652286907 +0000 UTC m=+225.048340940 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.164467 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.176362 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" podStartSLOduration=187.176345241 podStartE2EDuration="3m7.176345241s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:23.165122941 +0000 UTC m=+224.561176964" watchObservedRunningTime="2026-03-08 19:35:23.176345241 +0000 UTC m=+224.572399254" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.222789 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-s5pkw" podStartSLOduration=9.222776176 podStartE2EDuration="9.222776176s" podCreationTimestamp="2026-03-08 19:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:23.220644789 +0000 UTC m=+224.616698802" watchObservedRunningTime="2026-03-08 19:35:23.222776176 +0000 UTC m=+224.618830199" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.252442 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.256567 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.756546721 +0000 UTC m=+225.152600744 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.356136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.356492 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.856481601 +0000 UTC m=+225.252535624 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.369910 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" podStartSLOduration=188.36989172 podStartE2EDuration="3m8.36989172s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:23.335060096 +0000 UTC m=+224.731114119" watchObservedRunningTime="2026-03-08 19:35:23.36989172 +0000 UTC m=+224.765945733" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.459422 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.459964 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.959949303 +0000 UTC m=+225.356003316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.557371 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:23 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:23 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:23 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.557608 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.561430 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.561804 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-08 19:35:24.061786054 +0000 UTC m=+225.457840077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.648202 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.662884 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.663389 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:24.163359607 +0000 UTC m=+225.559413630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.721555 4885 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766319 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles\") pod \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766407 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d98wm\" (UniqueName: \"kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm\") pod \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766426 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert\") pod \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca\") pod \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766470 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config\") pod \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766619 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.766915 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:24.266904843 +0000 UTC m=+225.662958866 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.767641 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4c7583a8-a980-4ab2-a594-bf55ec72c91c" (UID: "4c7583a8-a980-4ab2-a594-bf55ec72c91c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.768745 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config" (OuterVolumeSpecName: "config") pod "4c7583a8-a980-4ab2-a594-bf55ec72c91c" (UID: "4c7583a8-a980-4ab2-a594-bf55ec72c91c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.769105 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c7583a8-a980-4ab2-a594-bf55ec72c91c" (UID: "4c7583a8-a980-4ab2-a594-bf55ec72c91c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.781730 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm" (OuterVolumeSpecName: "kube-api-access-d98wm") pod "4c7583a8-a980-4ab2-a594-bf55ec72c91c" (UID: "4c7583a8-a980-4ab2-a594-bf55ec72c91c"). InnerVolumeSpecName "kube-api-access-d98wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.782114 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c7583a8-a980-4ab2-a594-bf55ec72c91c" (UID: "4c7583a8-a980-4ab2-a594-bf55ec72c91c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.798986 4885 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-08T19:35:23.721580357Z","Handler":null,"Name":""} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.822090 4885 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.822121 4885 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.867827 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.868112 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d98wm\" (UniqueName: \"kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.868125 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.868133 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.868142 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.868151 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.874115 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.969197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.974174 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.974204 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.021356 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.128040 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.147410 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" event={"ID":"0bc91e4d-f2d9-494a-bca6-4a55cc82823b","Type":"ContainerStarted","Data":"dddedbc40f09532d80f83f22ccff53538b994d272f369aa4c46ad8964b06a01e"} Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.147460 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" event={"ID":"0bc91e4d-f2d9-494a-bca6-4a55cc82823b","Type":"ContainerStarted","Data":"8799e4a06d4c939fdc0004857b25547c74206974d9020f02811fdeb4e0bcc37f"} Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.173136 4885 generic.go:334] "Generic (PLEG): container finished" podID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerID="9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652" exitCode=0 Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.173202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" event={"ID":"4c7583a8-a980-4ab2-a594-bf55ec72c91c","Type":"ContainerDied","Data":"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652"} Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.173250 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" event={"ID":"4c7583a8-a980-4ab2-a594-bf55ec72c91c","Type":"ContainerDied","Data":"4a34d899e46ec2504a4c18a48ad21ba4015069c3cce16e7b4c57e8e3b93a6b39"} Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.173266 4885 scope.go:117] "RemoveContainer" containerID="9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.173266 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.174159 4885 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ldvgz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.174193 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.174965 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerName="route-controller-manager" containerID="cri-o://8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e" gracePeriod=30 Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.181637 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.200786 4885 scope.go:117] "RemoveContainer" containerID="9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652" Mar 08 19:35:24 crc kubenswrapper[4885]: E0308 19:35:24.202438 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652\": container with ID starting with 
9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652 not found: ID does not exist" containerID="9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.202485 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652"} err="failed to get container status \"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652\": rpc error: code = NotFound desc = could not find container \"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652\": container with ID starting with 9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652 not found: ID does not exist" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.214983 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" podStartSLOduration=10.214964854 podStartE2EDuration="10.214964854s" podCreationTimestamp="2026-03-08 19:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:24.188153545 +0000 UTC m=+225.584207568" watchObservedRunningTime="2026-03-08 19:35:24.214964854 +0000 UTC m=+225.611018877" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.236472 4885 ???:1] "http: TLS handshake error from 192.168.126.11:39208: no serving certificate available for the kubelet" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.248883 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.256853 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.330728 4885 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/community-operators-gnjnd"] Mar 08 19:35:24 crc kubenswrapper[4885]: E0308 19:35:24.331154 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerName="controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.331170 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerName="controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.331260 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerName="controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.331963 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.339506 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnjnd"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.339825 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.487486 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cp8v\" (UniqueName: \"kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.487525 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities\") pod \"community-operators-gnjnd\" (UID: 
\"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.487554 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.516418 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xpctw"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.519409 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.524260 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.527382 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xpctw"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.560447 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:24 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:24 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:24 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.560499 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589749 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589816 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589936 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cp8v\" (UniqueName: \"kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589977 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs66k\" (UniqueName: \"kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k\") 
pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589999 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.590582 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.591468 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.616912 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cp8v\" (UniqueName: \"kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.666410 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.679796 4885 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.693509 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.693671 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs66k\" (UniqueName: \"kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.693736 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.694219 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.694507 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content\") pod \"certified-operators-xpctw\" (UID: 
\"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.706199 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t9fn7"] Mar 08 19:35:24 crc kubenswrapper[4885]: E0308 19:35:24.706382 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1329795d-a8f9-4896-adba-23c2c0da9261" containerName="collect-profiles" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.706392 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1329795d-a8f9-4896-adba-23c2c0da9261" containerName="collect-profiles" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.706486 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1329795d-a8f9-4896-adba-23c2c0da9261" containerName="collect-profiles" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.707168 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.715788 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t9fn7"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.738346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs66k\" (UniqueName: \"kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.777074 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.795155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume\") pod \"1329795d-a8f9-4896-adba-23c2c0da9261\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.796248 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume\") pod \"1329795d-a8f9-4896-adba-23c2c0da9261\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.796417 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4fv6\" (UniqueName: \"kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6\") pod \"1329795d-a8f9-4896-adba-23c2c0da9261\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.796594 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.796627 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" 
Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.796690 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhfws\" (UniqueName: \"kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.797226 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume" (OuterVolumeSpecName: "config-volume") pod "1329795d-a8f9-4896-adba-23c2c0da9261" (UID: "1329795d-a8f9-4896-adba-23c2c0da9261"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.801850 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6" (OuterVolumeSpecName: "kube-api-access-k4fv6") pod "1329795d-a8f9-4896-adba-23c2c0da9261" (UID: "1329795d-a8f9-4896-adba-23c2c0da9261"). InnerVolumeSpecName "kube-api-access-k4fv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.803278 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.804897 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1329795d-a8f9-4896-adba-23c2c0da9261" (UID: "1329795d-a8f9-4896-adba-23c2c0da9261"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.849766 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899068 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca\") pod \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899112 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert\") pod \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899180 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxk9p\" (UniqueName: \"kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p\") pod \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899202 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config\") pod \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899364 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " 
pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899408 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhfws\" (UniqueName: \"kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899491 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899536 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899548 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899557 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4fv6\" (UniqueName: \"kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.900027 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content\") pod \"community-operators-t9fn7\" (UID: 
\"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.900780 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config" (OuterVolumeSpecName: "config") pod "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" (UID: "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.901015 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.901464 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca" (OuterVolumeSpecName: "client-ca") pod "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" (UID: "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.905427 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p" (OuterVolumeSpecName: "kube-api-access-lxk9p") pod "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" (UID: "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0"). InnerVolumeSpecName "kube-api-access-lxk9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.907567 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" (UID: "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.907635 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 19:35:24 crc kubenswrapper[4885]: E0308 19:35:24.907855 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerName="route-controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.907871 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerName="route-controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.907993 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerName="route-controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.908664 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.915820 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.927609 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhfws\" (UniqueName: \"kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002098 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002537 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002630 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdjrg\" (UniqueName: \"kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002719 4885 reconciler_common.go:293] 
"Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002733 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002745 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxk9p\" (UniqueName: \"kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002758 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.060580 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.105849 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.105907 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdjrg\" (UniqueName: \"kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.105969 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.106352 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.106390 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xpctw"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.106598 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.123409 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdjrg\" (UniqueName: \"kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: W0308 19:35:25.135201 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d8fbc68_3714_4fe4_9f62_857c5dc05661.slice/crio-9bf4496f530b593c8f965a319860f693e2e64a4e57a6c4d734640ea6410547bb WatchSource:0}: Error finding container 9bf4496f530b593c8f965a319860f693e2e64a4e57a6c4d734640ea6410547bb: Status 404 returned error can't find the container with id 9bf4496f530b593c8f965a319860f693e2e64a4e57a6c4d734640ea6410547bb Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.193787 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" event={"ID":"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1","Type":"ContainerStarted","Data":"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.193834 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" event={"ID":"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1","Type":"ContainerStarted","Data":"d5dd902e3ef717231619c64b1b91e79b07a9f0b3233c92d0567cafca72b99c09"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.195098 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.203466 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.204241 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.204296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" event={"ID":"1329795d-a8f9-4896-adba-23c2c0da9261","Type":"ContainerDied","Data":"134df6f2646bd48f24f5f5499ba5d32597cef8343a7f0bc84c8f90089d6df4b2"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.204329 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="134df6f2646bd48f24f5f5499ba5d32597cef8343a7f0bc84c8f90089d6df4b2" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.204424 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.207441 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208102 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208307 4885 generic.go:334] "Generic (PLEG): container finished" podID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerID="8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e" exitCode=0 Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208367 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" event={"ID":"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0","Type":"ContainerDied","Data":"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208393 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" event={"ID":"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0","Type":"ContainerDied","Data":"40c776312848989ccb2eddcff1adbedd8af2200ce48ffdc1e4635ef09792719e"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208410 4885 scope.go:117] "RemoveContainer" containerID="8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208502 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.211885 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.215071 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.215387 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.215953 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.216226 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.216389 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.222169 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.228487 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.229127 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.231333 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerStarted","Data":"9bf4496f530b593c8f965a319860f693e2e64a4e57a6c4d734640ea6410547bb"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.232258 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" podStartSLOduration=190.232241923 podStartE2EDuration="3m10.232241923s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:25.229483419 +0000 UTC m=+226.625537462" watchObservedRunningTime="2026-03-08 19:35:25.232241923 +0000 UTC m=+226.628295946" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.258372 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.278811 4885 scope.go:117] "RemoveContainer" containerID="8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e" Mar 08 19:35:25 crc kubenswrapper[4885]: E0308 19:35:25.279556 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e\": container with ID starting with 8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e not found: ID does not exist" 
containerID="8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.279598 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e"} err="failed to get container status \"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e\": rpc error: code = NotFound desc = could not find container \"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e\": container with ID starting with 8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e not found: ID does not exist" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.291124 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnjnd"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.297189 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t9fn7"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.303206 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.306134 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307541 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307600 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307634 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307700 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7qs7\" (UniqueName: \"kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307766 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krn88\" (UniqueName: \"kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88\") pod \"controller-manager-6dfb477bd6-857d4\" 
(UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307829 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307864 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307950 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.380142 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" path="/var/lib/kubelet/pods/4c7583a8-a980-4ab2-a594-bf55ec72c91c/volumes" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.381013 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.381569 
4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" path="/var/lib/kubelet/pods/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0/volumes" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409075 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409110 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409158 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409181 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q7qs7\" (UniqueName: \"kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krn88\" (UniqueName: \"kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409217 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409235 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409268 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " 
pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.410501 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.411262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.411512 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.411576 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.411903 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca\") pod \"route-controller-manager-5b7fc57755-9b4xh\" 
(UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.415890 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.416181 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.428361 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krn88\" (UniqueName: \"kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.431307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7qs7\" (UniqueName: \"kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.556326 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:25 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:25 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:25 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.556620 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.559316 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.561831 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.671015 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.794418 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.091408 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"] Mar 08 19:35:26 crc kubenswrapper[4885]: W0308 19:35:26.105789 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d700393_14b8_4abe_b77b_b2bfd718f024.slice/crio-f4b7c93c9aa4f548e3748b7d04e19d0043328272783835d3fd64c609b4f368bf WatchSource:0}: Error finding container f4b7c93c9aa4f548e3748b7d04e19d0043328272783835d3fd64c609b4f368bf: Status 404 returned error can't find the container with id f4b7c93c9aa4f548e3748b7d04e19d0043328272783835d3fd64c609b4f368bf Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.240452 4885 generic.go:334] "Generic (PLEG): container finished" podID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerID="87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54" exitCode=0 Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.240523 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerDied","Data":"87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.242351 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" 
event={"ID":"7d700393-14b8-4abe-b77b-b2bfd718f024","Type":"ContainerStarted","Data":"f4b7c93c9aa4f548e3748b7d04e19d0043328272783835d3fd64c609b4f368bf"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.244485 4885 generic.go:334] "Generic (PLEG): container finished" podID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerID="3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd" exitCode=0 Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.244545 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerDied","Data":"3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.244565 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerStarted","Data":"e265209fea1ffc2ce0afb5176cc04ccaf2989324d8da3b88689366337825e2af"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.252071 4885 generic.go:334] "Generic (PLEG): container finished" podID="038004f7-92de-42b0-8951-447dfdaf2f83" containerID="3a239ba546d78b237c7c4654423d382cd43547556be78002c1edfa9e41f28b19" exitCode=0 Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.252156 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerDied","Data":"3a239ba546d78b237c7c4654423d382cd43547556be78002c1edfa9e41f28b19"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.252199 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerStarted","Data":"75e0661a1b23d764b6ba36fa438a7e6bc5398d8d64dd93e640a640cee2d85a8d"} Mar 08 19:35:26 crc kubenswrapper[4885]: 
I0308 19:35:26.256168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" event={"ID":"26f8bff1-96c5-4c44-8e09-8f9785072c99","Type":"ContainerStarted","Data":"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.256200 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" event={"ID":"26f8bff1-96c5-4c44-8e09-8f9785072c99","Type":"ContainerStarted","Data":"bfeb46af4657b2486ffad869effd2392e9613e3a9f1f17a4e0c902d67a6fb26b"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.256403 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.273304 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" podStartSLOduration=3.27328499 podStartE2EDuration="3.27328499s" podCreationTimestamp="2026-03-08 19:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:26.272785506 +0000 UTC m=+227.668839529" watchObservedRunningTime="2026-03-08 19:35:26.27328499 +0000 UTC m=+227.669339023" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.277029 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerDied","Data":"c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.277243 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a6b85b3-0bb1-4199-983f-615a6c932f09" 
containerID="c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f" exitCode=0 Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.277319 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerStarted","Data":"de0e60604d3aa86bafd041642af24e9211dfd9322182b13ae9a6b56c608e4e2c"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.474424 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.507242 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-62xgk"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.508204 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.510099 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.544503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62xgk"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.555959 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:26 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:26 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:26 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.556246 4885 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.624503 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st8f9\" (UniqueName: \"kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.624556 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.624593 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.725406 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.725486 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.725585 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st8f9\" (UniqueName: \"kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.726206 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.726238 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.732751 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.733604 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.736633 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.736941 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.748113 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.768172 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st8f9\" (UniqueName: \"kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.821910 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.827123 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.827167 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.833690 4885 ???:1] "http: TLS handshake error from 192.168.126.11:39224: no serving certificate available for the kubelet" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.904400 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.905821 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.917078 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.929425 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.929465 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.929534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.971169 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.030830 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.031016 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn2mq\" (UniqueName: \"kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.031042 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.056306 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.120329 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.120616 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.121764 4885 patch_prober.go:28] interesting pod/console-f9d7485db-hsdmw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.121814 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hsdmw" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.132121 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn2mq\" (UniqueName: \"kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.132196 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.132246 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.132977 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.133309 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.152998 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn2mq\" (UniqueName: \"kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.211776 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2b7w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.211830 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2b7w" 
podUID="5ac2fbf9-c9bb-4ef8-988f-4407e688ad54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.211864 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2b7w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.211929 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-t2b7w" podUID="5ac2fbf9-c9bb-4ef8-988f-4407e688ad54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.223040 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.244355 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.244540 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.253858 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.291912 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" event={"ID":"7d700393-14b8-4abe-b77b-b2bfd718f024","Type":"ContainerStarted","Data":"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16"} Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.293759 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.298414 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.302345 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.312310 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" podStartSLOduration=4.312289752 podStartE2EDuration="4.312289752s" podCreationTimestamp="2026-03-08 19:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:27.308202533 +0000 UTC m=+228.704256566" watchObservedRunningTime="2026-03-08 19:35:27.312289752 +0000 UTC m=+228.708343765" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.317090 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.317143 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.331671 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.509748 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.515121 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.521509 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.549956 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.560668 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.564548 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:27 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:27 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:27 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.564592 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.646308 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5wpq\" (UniqueName: \"kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.646500 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.646614 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.747712 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5wpq\" (UniqueName: \"kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.747774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.747799 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.748309 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.748337 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.782496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5wpq\" (UniqueName: \"kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.803597 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.804219 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.808648 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.811223 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.815743 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.843044 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.849679 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.849747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.903627 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.904556 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.912276 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951114 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951185 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951228 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951353 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951420 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951507 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x76vh\" (UniqueName: \"kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.983879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.026791 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.052528 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.052609 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " 
pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.052641 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x76vh\" (UniqueName: \"kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.053126 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.053215 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.068829 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x76vh\" (UniqueName: \"kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.124304 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.223845 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.305892 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.557808 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:28 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:28 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:28 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.557850 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.759981 4885 ???:1] "http: TLS handshake error from 192.168.126.11:39236: no serving certificate available for the kubelet" Mar 08 19:35:29 crc kubenswrapper[4885]: I0308 19:35:29.555381 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:29 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:29 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:29 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:29 crc kubenswrapper[4885]: I0308 19:35:29.555437 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" 
podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:30 crc kubenswrapper[4885]: I0308 19:35:30.555181 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:30 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:30 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:30 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:30 crc kubenswrapper[4885]: I0308 19:35:30.555257 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:31 crc kubenswrapper[4885]: I0308 19:35:31.556503 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:31 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:31 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:31 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:31 crc kubenswrapper[4885]: I0308 19:35:31.556796 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:31 crc kubenswrapper[4885]: I0308 19:35:31.988346 4885 ???:1] "http: TLS handshake error from 192.168.126.11:39246: no serving certificate available for the kubelet" Mar 08 19:35:32 crc 
kubenswrapper[4885]: I0308 19:35:32.651622 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:32 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:32 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:32 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:32 crc kubenswrapper[4885]: I0308 19:35:32.651695 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:32 crc kubenswrapper[4885]: I0308 19:35:32.829026 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:35:32 crc kubenswrapper[4885]: I0308 19:35:32.829114 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:35:32 crc kubenswrapper[4885]: I0308 19:35:32.861762 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:35:33 crc kubenswrapper[4885]: I0308 19:35:33.051151 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:33 crc kubenswrapper[4885]: I0308 19:35:33.664120 4885 
patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 19:35:33 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld
Mar 08 19:35:33 crc kubenswrapper[4885]: [+]process-running ok
Mar 08 19:35:33 crc kubenswrapper[4885]: healthz check failed
Mar 08 19:35:33 crc kubenswrapper[4885]: I0308 19:35:33.664185    4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.287590    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"]
Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.349571    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerStarted","Data":"789329547b46208ee5dc38fb335a56f42b713329e6d11c7a30e5d4042c3f9ea3"}
Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.361177    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"]
Mar 08 19:35:34 crc kubenswrapper[4885]: W0308 19:35:34.377578    4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-2d0ba41af79384822819519e78d3dc7f4370f70eb9f77bb319b3453eb2e2a641 WatchSource:0}: Error finding container 2d0ba41af79384822819519e78d3dc7f4370f70eb9f77bb319b3453eb2e2a641: Status 404 returned error can't find the container with id 2d0ba41af79384822819519e78d3dc7f4370f70eb9f77bb319b3453eb2e2a641
Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.558419    4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 19:35:34 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld
Mar 08 19:35:34 crc kubenswrapper[4885]: [+]process-running ok
Mar 08 19:35:34 crc kubenswrapper[4885]: healthz check failed
Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.558463    4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.588169    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.594525    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"]
Mar 08 19:35:34 crc kubenswrapper[4885]: W0308 19:35:34.595903    4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0d3903bd_e9e1_4d1e_a03b_886542a8f32c.slice/crio-b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4 WatchSource:0}: Error finding container b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4: Status 404 returned error can't find the container with id b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4
Mar 08 19:35:34 crc kubenswrapper[4885]: W0308 19:35:34.599960    4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb30ce2c5_2b53_47aa_8470_394dd0d6256a.slice/crio-abebee8a42d402151f81519cf2493ac06e94550afb46b420c76c20ec60907798 WatchSource:0}: Error finding container abebee8a42d402151f81519cf2493ac06e94550afb46b420c76c20ec60907798: Status 404 returned error can't find the container with id abebee8a42d402151f81519cf2493ac06e94550afb46b420c76c20ec60907798
Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.642409    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.657754    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62xgk"]
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.679602    4885 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-ftkzn container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.680088    4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" podUID="5a244e04-1aec-4355-89c5-794667b5969f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.680839    4885 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-ftkzn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.680892    4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" podUID="5a244e04-1aec-4355-89c5-794667b5969f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.806836    4885 patch_prober.go:28] interesting pod/controller-manager-6dfb477bd6-857d4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.806899    4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.840543    4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.840591    4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.840810    4885 patch_prober.go:28] interesting pod/route-controller-manager-5b7fc57755-9b4xh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.840886    4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.845139    4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage4258948150/2\": happened during read: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.845254    4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdjrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-59wjr_openshift-marketplace(7346fb7f-6125-49c7-a422-cc169bc7e045): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage4258948150/2\": happened during read: context canceled" logger="UnhandledError"
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.846224    4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 08 19:35:44 crc kubenswrapper[4885]: [+]has-synced ok
Mar 08 19:35:44 crc kubenswrapper[4885]: [+]process-running ok
Mar 08 19:35:44 crc kubenswrapper[4885]: healthz check failed
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.846672    4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.846434    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage4258948150/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/certified-operators-59wjr" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045"
Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.846872    4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3548377267/3\": happened during read: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.847124    4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8cp8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gnjnd_openshift-marketplace(2a6b85b3-0bb1-4199-983f-615a6c932f09): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3548377267/3\": happened during read: context canceled" logger="UnhandledError"
Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.849341    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage3548377267/3\\\": happened during read: context canceled\"" pod="openshift-marketplace/community-operators-gnjnd" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09"
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.869983    4885 patch_prober.go:28] interesting pod/console-f9d7485db-hsdmw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body=
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.870214    4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hsdmw" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused"
Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.876152    4885 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="9.385s"
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.876284    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0d3903bd-e9e1-4d1e-a03b-886542a8f32c","Type":"ContainerStarted","Data":"b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4"}
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.908318    4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.908574    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerStarted","Data":"abebee8a42d402151f81519cf2493ac06e94550afb46b420c76c20ec60907798"}
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.910760    4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-t2b7w"
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.920572    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerStarted","Data":"2d0ba41af79384822819519e78d3dc7f4370f70eb9f77bb319b3453eb2e2a641"}
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.926741    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"]
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.926951    4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerName="controller-manager" containerID="cri-o://a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16" gracePeriod=30
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.930279    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"]
Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.930421    4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerName="route-controller-manager" containerID="cri-o://c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534" gracePeriod=30
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.426110    4885 csr.go:261] certificate signing request csr-27k6w is approved, waiting to be issued
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.436047    4885 csr.go:257] certificate signing request csr-27k6w is issued
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.462253    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.466785    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4"
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.557815    4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-lvfcn"
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.560320    4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-lvfcn"
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627428    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config\") pod \"26f8bff1-96c5-4c44-8e09-8f9785072c99\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") "
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627477    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert\") pod \"26f8bff1-96c5-4c44-8e09-8f9785072c99\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") "
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627660    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config\") pod \"7d700393-14b8-4abe-b77b-b2bfd718f024\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") "
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627721    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles\") pod \"7d700393-14b8-4abe-b77b-b2bfd718f024\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") "
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627741    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca\") pod \"26f8bff1-96c5-4c44-8e09-8f9785072c99\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") "
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627758    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca\") pod \"7d700393-14b8-4abe-b77b-b2bfd718f024\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") "
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627784    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krn88\" (UniqueName: \"kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88\") pod \"7d700393-14b8-4abe-b77b-b2bfd718f024\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") "
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627809    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert\") pod \"7d700393-14b8-4abe-b77b-b2bfd718f024\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") "
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627839    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7qs7\" (UniqueName: \"kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7\") pod \"26f8bff1-96c5-4c44-8e09-8f9785072c99\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") "
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.628824    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7d700393-14b8-4abe-b77b-b2bfd718f024" (UID: "7d700393-14b8-4abe-b77b-b2bfd718f024"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.628839    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca" (OuterVolumeSpecName: "client-ca") pod "26f8bff1-96c5-4c44-8e09-8f9785072c99" (UID: "26f8bff1-96c5-4c44-8e09-8f9785072c99"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.628880    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config" (OuterVolumeSpecName: "config") pod "7d700393-14b8-4abe-b77b-b2bfd718f024" (UID: "7d700393-14b8-4abe-b77b-b2bfd718f024"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.629108    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca" (OuterVolumeSpecName: "client-ca") pod "7d700393-14b8-4abe-b77b-b2bfd718f024" (UID: "7d700393-14b8-4abe-b77b-b2bfd718f024"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.629501    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config" (OuterVolumeSpecName: "config") pod "26f8bff1-96c5-4c44-8e09-8f9785072c99" (UID: "26f8bff1-96c5-4c44-8e09-8f9785072c99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.634870    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7d700393-14b8-4abe-b77b-b2bfd718f024" (UID: "7d700393-14b8-4abe-b77b-b2bfd718f024"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.637610    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "26f8bff1-96c5-4c44-8e09-8f9785072c99" (UID: "26f8bff1-96c5-4c44-8e09-8f9785072c99"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.639162    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7" (OuterVolumeSpecName: "kube-api-access-q7qs7") pod "26f8bff1-96c5-4c44-8e09-8f9785072c99" (UID: "26f8bff1-96c5-4c44-8e09-8f9785072c99"). InnerVolumeSpecName "kube-api-access-q7qs7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.644289    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88" (OuterVolumeSpecName: "kube-api-access-krn88") pod "7d700393-14b8-4abe-b77b-b2bfd718f024" (UID: "7d700393-14b8-4abe-b77b-b2bfd718f024"). InnerVolumeSpecName "kube-api-access-krn88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730377    4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config\") on node \"crc\" DevicePath \"\""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730429    4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730451    4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca\") on node \"crc\" DevicePath \"\""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730468    4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca\") on node \"crc\" DevicePath \"\""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730486    4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krn88\" (UniqueName: \"kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88\") on node \"crc\" DevicePath \"\""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730502    4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730519    4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7qs7\" (UniqueName: \"kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7\") on node \"crc\" DevicePath \"\""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730536    4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config\") on node \"crc\" DevicePath \"\""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730551    4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.928664    4885 generic.go:334] "Generic (PLEG): container finished" podID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerID="6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076" exitCode=0
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.928743    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerDied","Data":"6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.931136    4885 generic.go:334] "Generic (PLEG): container finished" podID="e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" containerID="0ee19fdd420f7949dd0bf49daa93e801fc425de54a37700ece1f21c1a92a4055" exitCode=0
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.931183    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90","Type":"ContainerDied","Data":"0ee19fdd420f7949dd0bf49daa93e801fc425de54a37700ece1f21c1a92a4055"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.931203    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90","Type":"ContainerStarted","Data":"464a16507c5cff61fcfdb5c5e82e0d8d5f3181505247b17da9c83a9f55dd19dc"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.934088    4885 generic.go:334] "Generic (PLEG): container finished" podID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerID="a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16" exitCode=0
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.934141    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" event={"ID":"7d700393-14b8-4abe-b77b-b2bfd718f024","Type":"ContainerDied","Data":"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.934158    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" event={"ID":"7d700393-14b8-4abe-b77b-b2bfd718f024","Type":"ContainerDied","Data":"f4b7c93c9aa4f548e3748b7d04e19d0043328272783835d3fd64c609b4f368bf"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.934174    4885 scope.go:117] "RemoveContainer" containerID="a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16"
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.934324    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4"
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.936895    4885 generic.go:334] "Generic (PLEG): container finished" podID="8881ba5e-d9d1-42a9-98af-849e72053757" containerID="3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2" exitCode=0
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.937050    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerDied","Data":"3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.939063    4885 generic.go:334] "Generic (PLEG): container finished" podID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerID="c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534" exitCode=0
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.939187    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" event={"ID":"26f8bff1-96c5-4c44-8e09-8f9785072c99","Type":"ContainerDied","Data":"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.939236    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" event={"ID":"26f8bff1-96c5-4c44-8e09-8f9785072c99","Type":"ContainerDied","Data":"bfeb46af4657b2486ffad869effd2392e9613e3a9f1f17a4e0c902d67a6fb26b"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.939356    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.943087    4885 generic.go:334] "Generic (PLEG): container finished" podID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerID="539ba9557c56dcf3cdbab11c5e667581fd8e8b4b7a9df312373694f7ca85489f" exitCode=0
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.943157    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerDied","Data":"539ba9557c56dcf3cdbab11c5e667581fd8e8b4b7a9df312373694f7ca85489f"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.945834    4885 generic.go:334] "Generic (PLEG): container finished" podID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerID="b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd" exitCode=0
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.946044    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerDied","Data":"b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.946094    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerStarted","Data":"731327ad0ac3fd48c5dcf825c4aabc506f0114149e811eabdfb465d917e7e122"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.948324    4885 generic.go:334] "Generic (PLEG): container finished" podID="60da1edb-8474-4368-a6ae-0bb2b1b7b845" containerID="67a173c7d23f4e826672d366138f3bcda3d03e275f18ddffd328f814fe2d0924" exitCode=0
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.948392    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" event={"ID":"60da1edb-8474-4368-a6ae-0bb2b1b7b845","Type":"ContainerDied","Data":"67a173c7d23f4e826672d366138f3bcda3d03e275f18ddffd328f814fe2d0924"}
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.952045    4885 generic.go:334] "Generic (PLEG): container finished" podID="0d3903bd-e9e1-4d1e-a03b-886542a8f32c" containerID="982a06c5fb1623eb2d10a88c4dd26658f16bff03da78f1ab525cd051e22aaa20" exitCode=0
Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.952205    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0d3903bd-e9e1-4d1e-a03b-886542a8f32c","Type":"ContainerDied","Data":"982a06c5fb1623eb2d10a88c4dd26658f16bff03da78f1ab525cd051e22aaa20"}
Mar 08 19:35:45 crc kubenswrapper[4885]: E0308 19:35:45.956944    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gnjnd" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09"
Mar 08 19:35:45 crc kubenswrapper[4885]: E0308 19:35:45.958293    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-59wjr" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045"
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.024704    4885 scope.go:117] "RemoveContainer" containerID="a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16"
Mar 08 19:35:46 crc kubenswrapper[4885]: E0308 19:35:46.025249    4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16\": container with ID starting with a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16 not found: ID does not exist" containerID="a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16"
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.025307    4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16"} err="failed to get container status \"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16\": rpc error: code = NotFound desc = could not find container \"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16\": container with ID starting with a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16 not found: ID does not exist"
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.025328    4885 scope.go:117] "RemoveContainer" containerID="c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534"
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.047676    4885 scope.go:117] "RemoveContainer" containerID="c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534"
Mar 08 19:35:46 crc kubenswrapper[4885]: E0308 19:35:46.048154    4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534\": container with ID starting with c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534 not found: ID does not exist" containerID="c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534"
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.048226    4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534"} err="failed to get container status \"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534\": rpc error: code = NotFound desc = could not find container \"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534\": container with ID starting with c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534 not found: ID does not exist"
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.139053    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"]
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.142476    4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"]
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.151704    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"]
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.157105    4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"]
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.438054    4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-16 17:37:17.831812975 +0000 UTC
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.438094    4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7534h1m31.393721884s for next certificate rotation
Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.494602    4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"]
Mar 08 19:35:46 crc kubenswrapper[4885]: E0308 19:35:46.495052    4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerName="route-controller-manager"
Mar 08 19:35:46 crc kubenswrapper[4885]:
I0308 19:35:46.495075 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerName="route-controller-manager" Mar 08 19:35:46 crc kubenswrapper[4885]: E0308 19:35:46.495108 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerName="controller-manager" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.495122 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerName="controller-manager" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.495314 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerName="route-controller-manager" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.495336 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerName="controller-manager" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.496111 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.499826 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.499875 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.500298 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.500576 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.500862 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.504469 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.513362 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.521038 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.528898 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"] Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.529160 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.530914 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.531331 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.531971 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.532152 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.532060 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.542601 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.553601 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644533 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644623 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644660 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644687 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644714 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm58j\" (UniqueName: \"kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " 
pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644741 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644772 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644800 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644835 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c67vv\" (UniqueName: \"kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745573 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745691 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745741 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745777 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745830 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm58j\" (UniqueName: \"kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 
19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745875 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745984 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.746035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.746095 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c67vv\" (UniqueName: \"kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.747589 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca\") pod 
\"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.748811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.748994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.750334 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.751616 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.756445 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.756456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.771856 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm58j\" (UniqueName: \"kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.774531 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c67vv\" (UniqueName: \"kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.831673 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.852136 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.082989 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:35:47 crc kubenswrapper[4885]: W0308 19:35:47.101600 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d5c0c11_15e0_47ed_817c_939899828e1e.slice/crio-aa200f5debabe8fc7be4d573b21b524afcb2289d907cd50fd947f48edbba3041 WatchSource:0}: Error finding container aa200f5debabe8fc7be4d573b21b524afcb2289d907cd50fd947f48edbba3041: Status 404 returned error can't find the container with id aa200f5debabe8fc7be4d573b21b524afcb2289d907cd50fd947f48edbba3041 Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.121186 4885 patch_prober.go:28] interesting pod/console-f9d7485db-hsdmw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.121244 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hsdmw" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.243418 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.249196 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.252434 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.328490 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"] Mar 08 19:35:47 crc kubenswrapper[4885]: W0308 19:35:47.336479 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab31615e_dc70_40ba_9b47_6a3f119e91d9.slice/crio-8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759 WatchSource:0}: Error finding container 8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759: Status 404 returned error can't find the container with id 8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759 Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.360754 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access\") pod \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.360852 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access\") pod \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.360883 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtctx\" (UniqueName: 
\"kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx\") pod \"60da1edb-8474-4368-a6ae-0bb2b1b7b845\" (UID: \"60da1edb-8474-4368-a6ae-0bb2b1b7b845\") " Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.360939 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir\") pod \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.360983 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir\") pod \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.361332 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0d3903bd-e9e1-4d1e-a03b-886542a8f32c" (UID: "0d3903bd-e9e1-4d1e-a03b-886542a8f32c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.361380 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" (UID: "e7fdc7e5-2196-4f4f-84d4-dfc82848cb90"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.366750 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" (UID: "e7fdc7e5-2196-4f4f-84d4-dfc82848cb90"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.367301 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx" (OuterVolumeSpecName: "kube-api-access-qtctx") pod "60da1edb-8474-4368-a6ae-0bb2b1b7b845" (UID: "60da1edb-8474-4368-a6ae-0bb2b1b7b845"). InnerVolumeSpecName "kube-api-access-qtctx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.368450 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0d3903bd-e9e1-4d1e-a03b-886542a8f32c" (UID: "0d3903bd-e9e1-4d1e-a03b-886542a8f32c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.376332 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" path="/var/lib/kubelet/pods/26f8bff1-96c5-4c44-8e09-8f9785072c99/volumes" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.377153 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" path="/var/lib/kubelet/pods/7d700393-14b8-4abe-b77b-b2bfd718f024/volumes" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.438577 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-12 09:55:51.588079722 +0000 UTC Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.438892 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6686h20m4.149191463s for next certificate rotation Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.463093 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.463128 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.463141 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtctx\" (UniqueName: \"kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.463155 4885 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.463165 4885 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.966986 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0d3903bd-e9e1-4d1e-a03b-886542a8f32c","Type":"ContainerDied","Data":"b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.967020 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.967071 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.969444 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" event={"ID":"0d5c0c11-15e0-47ed-817c-939899828e1e","Type":"ContainerStarted","Data":"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.969495 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" event={"ID":"0d5c0c11-15e0-47ed-817c-939899828e1e","Type":"ContainerStarted","Data":"aa200f5debabe8fc7be4d573b21b524afcb2289d907cd50fd947f48edbba3041"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.969825 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.971142 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" event={"ID":"ab31615e-dc70-40ba-9b47-6a3f119e91d9","Type":"ContainerStarted","Data":"7e9e48dffa2c1d1b35dc16a09da7078a95e25a71cb56e7dff87781cbf3d61f90"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.971302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" event={"ID":"ab31615e-dc70-40ba-9b47-6a3f119e91d9","Type":"ContainerStarted","Data":"8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.971441 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.973366 4885 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.973419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90","Type":"ContainerDied","Data":"464a16507c5cff61fcfdb5c5e82e0d8d5f3181505247b17da9c83a9f55dd19dc"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.973459 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="464a16507c5cff61fcfdb5c5e82e0d8d5f3181505247b17da9c83a9f55dd19dc" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.975307 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" event={"ID":"60da1edb-8474-4368-a6ae-0bb2b1b7b845","Type":"ContainerDied","Data":"37aa47577a13b111a43340382f6d3fe85ac7c15ce7d0d5052fe2dc63b6ee4d2c"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.975365 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37aa47577a13b111a43340382f6d3fe85ac7c15ce7d0d5052fe2dc63b6ee4d2c" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.975368 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.981343 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.987342 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:48 crc kubenswrapper[4885]: I0308 19:35:48.004766 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" podStartSLOduration=4.004738468 podStartE2EDuration="4.004738468s" podCreationTimestamp="2026-03-08 19:35:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:47.9992101 +0000 UTC m=+249.395264203" watchObservedRunningTime="2026-03-08 19:35:48.004738468 +0000 UTC m=+249.400792531" Mar 08 19:35:48 crc kubenswrapper[4885]: I0308 19:35:48.026224 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" podStartSLOduration=4.026209123 podStartE2EDuration="4.026209123s" podCreationTimestamp="2026-03-08 19:35:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:48.02459011 +0000 UTC m=+249.420644133" watchObservedRunningTime="2026-03-08 19:35:48.026209123 +0000 UTC m=+249.422263146" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.494420 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage4151231740/3\": happened 
during read: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.494897 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gs66k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xpctw_openshift-marketplace(7d8fbc68-3714-4fe4-9f62-857c5dc05661): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file 
\"/var/tmp/container_images_storage4151231740/3\": happened during read: context canceled" logger="UnhandledError" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.496150 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage4151231740/3\\\": happened during read: context canceled\"" pod="openshift-marketplace/certified-operators-xpctw" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.519562 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2691475163/2\": happened during read: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.519777 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhfws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-t9fn7_openshift-marketplace(038004f7-92de-42b0-8951-447dfdaf2f83): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2691475163/2\": happened during read: context canceled" logger="UnhandledError" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.521058 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file 
\\\"/var/tmp/container_images_storage2691475163/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/community-operators-t9fn7" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" Mar 08 19:35:55 crc kubenswrapper[4885]: E0308 19:35:55.018874 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xpctw" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" Mar 08 19:35:55 crc kubenswrapper[4885]: E0308 19:35:55.018943 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-t9fn7" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" Mar 08 19:35:55 crc kubenswrapper[4885]: E0308 19:35:55.216280 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-conmon-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:35:57 crc kubenswrapper[4885]: I0308 19:35:57.127280 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:57 crc kubenswrapper[4885]: I0308 19:35:57.134451 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:57 crc kubenswrapper[4885]: I0308 19:35:57.699166 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.149189 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549976-nhqg4"] Mar 08 19:36:00 crc kubenswrapper[4885]: E0308 19:36:00.149507 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3903bd-e9e1-4d1e-a03b-886542a8f32c" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.149528 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3903bd-e9e1-4d1e-a03b-886542a8f32c" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: E0308 19:36:00.149559 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60da1edb-8474-4368-a6ae-0bb2b1b7b845" containerName="oc" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.149574 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="60da1edb-8474-4368-a6ae-0bb2b1b7b845" containerName="oc" Mar 08 19:36:00 crc kubenswrapper[4885]: E0308 19:36:00.149595 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.149609 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.149810 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.150050 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="60da1edb-8474-4368-a6ae-0bb2b1b7b845" containerName="oc" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 
19:36:00.150071 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3903bd-e9e1-4d1e-a03b-886542a8f32c" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.150607 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.154848 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.154900 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.155050 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.180754 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbjqw\" (UniqueName: \"kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw\") pod \"auto-csr-approver-29549976-nhqg4\" (UID: \"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58\") " pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.184319 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549976-nhqg4"] Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.282051 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbjqw\" (UniqueName: \"kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw\") pod \"auto-csr-approver-29549976-nhqg4\" (UID: \"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58\") " pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.316671 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zbjqw\" (UniqueName: \"kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw\") pod \"auto-csr-approver-29549976-nhqg4\" (UID: \"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58\") " pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.487351 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.744636 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549976-nhqg4"] Mar 08 19:36:01 crc kubenswrapper[4885]: I0308 19:36:01.052483 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" event={"ID":"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58","Type":"ContainerStarted","Data":"d6279e33c9c34db2ad8eae44a300197105d2b923e0ae139b24d2c1b666cdd843"} Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.600093 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.600428 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" podUID="0d5c0c11-15e0-47ed-817c-939899828e1e" containerName="controller-manager" containerID="cri-o://e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b" gracePeriod=30 Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.693970 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"] Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.694345 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" podUID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" containerName="route-controller-manager" containerID="cri-o://7e9e48dffa2c1d1b35dc16a09da7078a95e25a71cb56e7dff87781cbf3d61f90" gracePeriod=30 Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.819448 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.819510 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.962785 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.036199 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert\") pod \"0d5c0c11-15e0-47ed-817c-939899828e1e\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.036270 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca\") pod \"0d5c0c11-15e0-47ed-817c-939899828e1e\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.036288 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm58j\" (UniqueName: \"kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j\") pod \"0d5c0c11-15e0-47ed-817c-939899828e1e\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.036348 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles\") pod \"0d5c0c11-15e0-47ed-817c-939899828e1e\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.036400 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config\") pod \"0d5c0c11-15e0-47ed-817c-939899828e1e\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.037498 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0d5c0c11-15e0-47ed-817c-939899828e1e" (UID: "0d5c0c11-15e0-47ed-817c-939899828e1e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.037523 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca" (OuterVolumeSpecName: "client-ca") pod "0d5c0c11-15e0-47ed-817c-939899828e1e" (UID: "0d5c0c11-15e0-47ed-817c-939899828e1e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.037614 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config" (OuterVolumeSpecName: "config") pod "0d5c0c11-15e0-47ed-817c-939899828e1e" (UID: "0d5c0c11-15e0-47ed-817c-939899828e1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.044222 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j" (OuterVolumeSpecName: "kube-api-access-cm58j") pod "0d5c0c11-15e0-47ed-817c-939899828e1e" (UID: "0d5c0c11-15e0-47ed-817c-939899828e1e"). InnerVolumeSpecName "kube-api-access-cm58j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.048558 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0d5c0c11-15e0-47ed-817c-939899828e1e" (UID: "0d5c0c11-15e0-47ed-817c-939899828e1e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.064584 4885 generic.go:334] "Generic (PLEG): container finished" podID="0d5c0c11-15e0-47ed-817c-939899828e1e" containerID="e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b" exitCode=0 Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.064666 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.064785 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" event={"ID":"0d5c0c11-15e0-47ed-817c-939899828e1e","Type":"ContainerDied","Data":"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b"} Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.064821 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" event={"ID":"0d5c0c11-15e0-47ed-817c-939899828e1e","Type":"ContainerDied","Data":"aa200f5debabe8fc7be4d573b21b524afcb2289d907cd50fd947f48edbba3041"} Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.064843 4885 scope.go:117] "RemoveContainer" containerID="e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.067054 4885 generic.go:334] "Generic (PLEG): container finished" podID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" containerID="7e9e48dffa2c1d1b35dc16a09da7078a95e25a71cb56e7dff87781cbf3d61f90" exitCode=0 Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.067095 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" event={"ID":"ab31615e-dc70-40ba-9b47-6a3f119e91d9","Type":"ContainerDied","Data":"7e9e48dffa2c1d1b35dc16a09da7078a95e25a71cb56e7dff87781cbf3d61f90"} Mar 08 
19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.067117 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" event={"ID":"ab31615e-dc70-40ba-9b47-6a3f119e91d9","Type":"ContainerDied","Data":"8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759"} Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.067130 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.078468 4885 scope.go:117] "RemoveContainer" containerID="e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b" Mar 08 19:36:03 crc kubenswrapper[4885]: E0308 19:36:03.078769 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b\": container with ID starting with e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b not found: ID does not exist" containerID="e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.078793 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b"} err="failed to get container status \"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b\": rpc error: code = NotFound desc = could not find container \"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b\": container with ID starting with e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b not found: ID does not exist" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.079195 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.119265 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.121772 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert\") pod \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138187 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c67vv\" (UniqueName: \"kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv\") pod \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138238 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config\") pod \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138345 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca\") pod \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138647 4885 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138659 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138690 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm58j\" (UniqueName: \"kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138713 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138730 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.139163 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config" (OuterVolumeSpecName: "config") pod "ab31615e-dc70-40ba-9b47-6a3f119e91d9" (UID: "ab31615e-dc70-40ba-9b47-6a3f119e91d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.139157 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab31615e-dc70-40ba-9b47-6a3f119e91d9" (UID: "ab31615e-dc70-40ba-9b47-6a3f119e91d9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.140873 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv" (OuterVolumeSpecName: "kube-api-access-c67vv") pod "ab31615e-dc70-40ba-9b47-6a3f119e91d9" (UID: "ab31615e-dc70-40ba-9b47-6a3f119e91d9"). InnerVolumeSpecName "kube-api-access-c67vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.140880 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab31615e-dc70-40ba-9b47-6a3f119e91d9" (UID: "ab31615e-dc70-40ba-9b47-6a3f119e91d9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.239569 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.239629 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c67vv\" (UniqueName: \"kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.239665 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.239683 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca\") on node \"crc\" DevicePath 
\"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.384207 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d5c0c11-15e0-47ed-817c-939899828e1e" path="/var/lib/kubelet/pods/0d5c0c11-15e0-47ed-817c-939899828e1e/volumes" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.074723 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.102995 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"] Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.107989 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"] Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.518805 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"] Mar 08 19:36:04 crc kubenswrapper[4885]: E0308 19:36:04.519152 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5c0c11-15e0-47ed-817c-939899828e1e" containerName="controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.519168 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5c0c11-15e0-47ed-817c-939899828e1e" containerName="controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: E0308 19:36:04.519645 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" containerName="route-controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.519658 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" containerName="route-controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.519790 4885 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0d5c0c11-15e0-47ed-817c-939899828e1e" containerName="controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.519801 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" containerName="route-controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.522419 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.528402 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.528710 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.529672 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.530853 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"] Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.531623 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.531713 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.531969 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.533289 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.540963 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.541163 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"] Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.546119 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"] Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.580744 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.580911 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87mqf\" (UniqueName: \"kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581290 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtmxc\" (UniqueName: 
\"kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581337 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581400 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581474 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " 
pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581499 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581847 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.583151 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.583532 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.583832 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.583551 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.584244 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.584472 4885 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684161 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87mqf\" (UniqueName: \"kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684337 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtmxc\" (UniqueName: \"kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684377 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684440 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684476 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684508 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684571 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 
19:36:04.685774 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.686183 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.686422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.687935 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.690613 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " 
pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.692030 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.692052 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.703551 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87mqf\" (UniqueName: \"kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.704032 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtmxc\" (UniqueName: \"kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.908264 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.922960 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.241407 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"] Mar 08 19:36:05 crc kubenswrapper[4885]: E0308 19:36:05.331032 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-conmon-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.348085 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"] Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.383621 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" path="/var/lib/kubelet/pods/ab31615e-dc70-40ba-9b47-6a3f119e91d9/volumes" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.586161 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.586868 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.589330 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.589610 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.600044 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.700861 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.700930 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.802063 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.802118 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.802166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.818762 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.905285 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.098786 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" event={"ID":"0514086c-91a5-4b80-8920-b2b78d4faba3","Type":"ContainerStarted","Data":"d780b5eabbfcbe56d150e28f4acfba07ce728041d152bf3d5c0c48b9dc32a456"} Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.098825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" event={"ID":"0514086c-91a5-4b80-8920-b2b78d4faba3","Type":"ContainerStarted","Data":"51d88566446210e3b83617f522140b5e6b40693eb161e53013b9f354546417e9"} Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.099504 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.102507 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" event={"ID":"97017523-4956-4bde-84a3-5859a2edf389","Type":"ContainerStarted","Data":"f930e10771b84878f3ee1e71b5063ef75713f3fd89c0b6b9084dd1d3903c6042"} Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.102546 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" event={"ID":"97017523-4956-4bde-84a3-5859a2edf389","Type":"ContainerStarted","Data":"6e9fa8f3dc6a11417fee7c798a28fbc65a8413f8866bef37973386caf66851c6"} Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.103102 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.107297 4885 generic.go:334] "Generic (PLEG): container 
finished" podID="5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" containerID="0685a4906cd8df57a6fc2f16599ba5b339b14b8ee4e2f165c183f54473c7f2ff" exitCode=0 Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.107336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" event={"ID":"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58","Type":"ContainerDied","Data":"0685a4906cd8df57a6fc2f16599ba5b339b14b8ee4e2f165c183f54473c7f2ff"} Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.115815 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" podStartSLOduration=4.115798223 podStartE2EDuration="4.115798223s" podCreationTimestamp="2026-03-08 19:36:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:06.112959026 +0000 UTC m=+267.509013049" watchObservedRunningTime="2026-03-08 19:36:06.115798223 +0000 UTC m=+267.511852246" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.122669 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.133198 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.138029 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" podStartSLOduration=4.138016768 podStartE2EDuration="4.138016768s" podCreationTimestamp="2026-03-08 19:36:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:06.133390834 +0000 UTC 
m=+267.529444857" watchObservedRunningTime="2026-03-08 19:36:06.138016768 +0000 UTC m=+267.534070791" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.216533 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 19:36:06 crc kubenswrapper[4885]: W0308 19:36:06.274638 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5016b299_d93d_4bf9_9e51_a10e68da79bc.slice/crio-a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6 WatchSource:0}: Error finding container a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6: Status 404 returned error can't find the container with id a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6 Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.117750 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5016b299-d93d-4bf9-9e51-a10e68da79bc","Type":"ContainerStarted","Data":"7ebd4befa1406c40708347ef7d1192142a51ec8351b345f6e5b1c9291b85b870"} Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.118139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5016b299-d93d-4bf9-9e51-a10e68da79bc","Type":"ContainerStarted","Data":"a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6"} Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.448117 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.464201 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.464185318 podStartE2EDuration="2.464185318s" podCreationTimestamp="2026-03-08 19:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:07.138474447 +0000 UTC m=+268.534528470" watchObservedRunningTime="2026-03-08 19:36:07.464185318 +0000 UTC m=+268.860239341" Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.527210 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbjqw\" (UniqueName: \"kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw\") pod \"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58\" (UID: \"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58\") " Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.532753 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw" (OuterVolumeSpecName: "kube-api-access-zbjqw") pod "5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" (UID: "5a29b0f8-8eee-4c05-9bbe-bebb70f16e58"). InnerVolumeSpecName "kube-api-access-zbjqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.628787 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbjqw\" (UniqueName: \"kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:08 crc kubenswrapper[4885]: I0308 19:36:08.128064 4885 generic.go:334] "Generic (PLEG): container finished" podID="5016b299-d93d-4bf9-9e51-a10e68da79bc" containerID="7ebd4befa1406c40708347ef7d1192142a51ec8351b345f6e5b1c9291b85b870" exitCode=0 Mar 08 19:36:08 crc kubenswrapper[4885]: I0308 19:36:08.128156 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5016b299-d93d-4bf9-9e51-a10e68da79bc","Type":"ContainerDied","Data":"7ebd4befa1406c40708347ef7d1192142a51ec8351b345f6e5b1c9291b85b870"} Mar 08 19:36:08 crc kubenswrapper[4885]: I0308 19:36:08.132845 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:08 crc kubenswrapper[4885]: I0308 19:36:08.136696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" event={"ID":"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58","Type":"ContainerDied","Data":"d6279e33c9c34db2ad8eae44a300197105d2b923e0ae139b24d2c1b666cdd843"} Mar 08 19:36:08 crc kubenswrapper[4885]: I0308 19:36:08.136735 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6279e33c9c34db2ad8eae44a300197105d2b923e0ae139b24d2c1b666cdd843" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.045048 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 19:36:12 crc kubenswrapper[4885]: E0308 19:36:12.045579 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" containerName="oc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.045592 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" containerName="oc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.045689 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" containerName="oc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.046107 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.060099 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.234471 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.234527 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.234583 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.336367 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.336445 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.336507 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.336553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.337378 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.353652 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.403115 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:15 crc kubenswrapper[4885]: E0308 19:36:15.475448 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-conmon-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.334406 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.395832 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir\") pod \"5016b299-d93d-4bf9-9e51-a10e68da79bc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.395971 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5016b299-d93d-4bf9-9e51-a10e68da79bc" (UID: "5016b299-d93d-4bf9-9e51-a10e68da79bc"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.395994 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access\") pod \"5016b299-d93d-4bf9-9e51-a10e68da79bc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.396351 4885 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.404172 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5016b299-d93d-4bf9-9e51-a10e68da79bc" (UID: "5016b299-d93d-4bf9-9e51-a10e68da79bc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.497305 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:18 crc kubenswrapper[4885]: I0308 19:36:18.093619 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5016b299-d93d-4bf9-9e51-a10e68da79bc","Type":"ContainerDied","Data":"a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6"} Mar 08 19:36:18 crc kubenswrapper[4885]: I0308 19:36:18.093656 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6" Mar 08 19:36:18 crc kubenswrapper[4885]: I0308 19:36:18.093672 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:25 crc kubenswrapper[4885]: E0308 19:36:25.587037 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-conmon-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:36:29 crc kubenswrapper[4885]: I0308 19:36:29.400585 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"] Mar 08 19:36:30 crc kubenswrapper[4885]: E0308 19:36:30.345093 4885 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 08 19:36:30 crc kubenswrapper[4885]: E0308 19:36:30.345470 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn2mq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-cptvd_openshift-marketplace(b30ce2c5-2b53-47aa-8470-394dd0d6256a): ErrImagePull: rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 19:36:30 crc kubenswrapper[4885]: E0308 19:36:30.346696 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-cptvd" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.335034 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-cptvd" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.460123 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.460676 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdjrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-59wjr_openshift-marketplace(7346fb7f-6125-49c7-a422-cc169bc7e045): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.462233 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-59wjr" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" Mar 08 19:36:32 crc 
kubenswrapper[4885]: E0308 19:36:32.471439 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.471585 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x76vh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-pqxt7_openshift-marketplace(8881ba5e-d9d1-42a9-98af-849e72053757): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.472985 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-pqxt7" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.493692 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.493857 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-st8f9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-62xgk_openshift-marketplace(05666e0b-c4ce-451a-ba67-ddb78866ef54): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.495469 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-62xgk" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" Mar 08 19:36:32 crc 
kubenswrapper[4885]: I0308 19:36:32.735145 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 19:36:32 crc kubenswrapper[4885]: I0308 19:36:32.846531 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:36:32 crc kubenswrapper[4885]: I0308 19:36:32.846894 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:36:32 crc kubenswrapper[4885]: I0308 19:36:32.846983 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:36:32 crc kubenswrapper[4885]: I0308 19:36:32.847499 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 19:36:32 crc kubenswrapper[4885]: I0308 19:36:32.847555 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a" gracePeriod=600 Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.169059 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cf6a8691-f048-4173-8c9e-06bca13e37e1","Type":"ContainerStarted","Data":"d0cc1d7c926212abd6b6240d8d958b1f68d5c10c839d13411e786e80784f25ac"} Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.170401 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerStarted","Data":"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313"} Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.171801 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerStarted","Data":"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908"} Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.174674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerStarted","Data":"119e5e86f78bbb6aaaede8af002474c8df38720baf17de30416c30bd76579ad1"} Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.176132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerStarted","Data":"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b"} Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.177644 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a" exitCode=0 Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.177730 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a"} Mar 08 19:36:33 crc kubenswrapper[4885]: E0308 19:36:33.178601 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-62xgk" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" Mar 08 19:36:33 crc kubenswrapper[4885]: E0308 19:36:33.180719 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-pqxt7" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.185557 4885 generic.go:334] "Generic (PLEG): container finished" podID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerID="06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313" exitCode=0 Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.185688 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerDied","Data":"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313"} Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.189280 4885 generic.go:334] "Generic (PLEG): container finished" podID="038004f7-92de-42b0-8951-447dfdaf2f83" containerID="119e5e86f78bbb6aaaede8af002474c8df38720baf17de30416c30bd76579ad1" exitCode=0 Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.189383 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" 
event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerDied","Data":"119e5e86f78bbb6aaaede8af002474c8df38720baf17de30416c30bd76579ad1"} Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.194661 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerID="fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b" exitCode=0 Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.194738 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerDied","Data":"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b"} Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.205206 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc"} Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.207025 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cf6a8691-f048-4173-8c9e-06bca13e37e1","Type":"ContainerStarted","Data":"3c326d6a1a9e49cd242fcac137bf8c9ce49e4cb4b6826f444d12dd45c38d8ec9"} Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.255694 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=22.255661059 podStartE2EDuration="22.255661059s" podCreationTimestamp="2026-03-08 19:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:34.235142618 +0000 UTC m=+295.631196681" watchObservedRunningTime="2026-03-08 19:36:34.255661059 +0000 UTC m=+295.651715082" Mar 08 19:36:35 crc kubenswrapper[4885]: I0308 
19:36:35.220466 4885 generic.go:334] "Generic (PLEG): container finished" podID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerID="aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908" exitCode=0 Mar 08 19:36:35 crc kubenswrapper[4885]: I0308 19:36:35.220517 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerDied","Data":"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908"} Mar 08 19:36:35 crc kubenswrapper[4885]: E0308 19:36:35.690696 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-conmon-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:36:37 crc kubenswrapper[4885]: I0308 19:36:37.234426 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerStarted","Data":"27737080ffcd1af304d093dd7fb9b8c98d2e1d3a468d0c510cc2a7d938b9d1b7"} Mar 08 19:36:37 crc kubenswrapper[4885]: I0308 19:36:37.246139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerStarted","Data":"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a"} Mar 08 19:36:37 crc kubenswrapper[4885]: I0308 19:36:37.259281 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-t9fn7" podStartSLOduration=3.470038712 podStartE2EDuration="1m13.259262465s" podCreationTimestamp="2026-03-08 19:35:24 +0000 UTC" firstStartedPulling="2026-03-08 19:35:26.264941907 +0000 UTC m=+227.660995930" lastFinishedPulling="2026-03-08 19:36:36.05416566 +0000 UTC m=+297.450219683" observedRunningTime="2026-03-08 19:36:37.25348836 +0000 UTC m=+298.649542413" watchObservedRunningTime="2026-03-08 19:36:37.259262465 +0000 UTC m=+298.655316488" Mar 08 19:36:38 crc kubenswrapper[4885]: I0308 19:36:38.254365 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerStarted","Data":"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a"} Mar 08 19:36:38 crc kubenswrapper[4885]: I0308 19:36:38.256853 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerStarted","Data":"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80"} Mar 08 19:36:38 crc kubenswrapper[4885]: I0308 19:36:38.293528 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xpctw" podStartSLOduration=3.2402418 podStartE2EDuration="1m14.293481558s" podCreationTimestamp="2026-03-08 19:35:24 +0000 UTC" firstStartedPulling="2026-03-08 19:35:26.241822136 +0000 UTC m=+227.637876149" lastFinishedPulling="2026-03-08 19:36:37.295061884 +0000 UTC m=+298.691115907" observedRunningTime="2026-03-08 19:36:38.293243772 +0000 UTC m=+299.689297795" watchObservedRunningTime="2026-03-08 19:36:38.293481558 +0000 UTC m=+299.689535611" Mar 08 19:36:38 crc kubenswrapper[4885]: I0308 19:36:38.307008 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gnjnd" podStartSLOduration=3.525347463 
podStartE2EDuration="1m14.30696809s" podCreationTimestamp="2026-03-08 19:35:24 +0000 UTC" firstStartedPulling="2026-03-08 19:35:26.278605963 +0000 UTC m=+227.674659996" lastFinishedPulling="2026-03-08 19:36:37.06022659 +0000 UTC m=+298.456280623" observedRunningTime="2026-03-08 19:36:38.30657801 +0000 UTC m=+299.702632033" watchObservedRunningTime="2026-03-08 19:36:38.30696809 +0000 UTC m=+299.703022113" Mar 08 19:36:38 crc kubenswrapper[4885]: I0308 19:36:38.324204 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-prdq9" podStartSLOduration=19.747049504 podStartE2EDuration="1m11.324189312s" podCreationTimestamp="2026-03-08 19:35:27 +0000 UTC" firstStartedPulling="2026-03-08 19:35:45.931406339 +0000 UTC m=+247.327460372" lastFinishedPulling="2026-03-08 19:36:37.508546157 +0000 UTC m=+298.904600180" observedRunningTime="2026-03-08 19:36:38.321537931 +0000 UTC m=+299.717591954" watchObservedRunningTime="2026-03-08 19:36:38.324189312 +0000 UTC m=+299.720243335" Mar 08 19:36:42 crc kubenswrapper[4885]: I0308 19:36:42.534645 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"] Mar 08 19:36:42 crc kubenswrapper[4885]: I0308 19:36:42.535216 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" podUID="97017523-4956-4bde-84a3-5859a2edf389" containerName="controller-manager" containerID="cri-o://f930e10771b84878f3ee1e71b5063ef75713f3fd89c0b6b9084dd1d3903c6042" gracePeriod=30 Mar 08 19:36:42 crc kubenswrapper[4885]: I0308 19:36:42.641318 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"] Mar 08 19:36:42 crc kubenswrapper[4885]: I0308 19:36:42.641591 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerName="route-controller-manager" containerID="cri-o://d780b5eabbfcbe56d150e28f4acfba07ce728041d152bf3d5c0c48b9dc32a456" gracePeriod=30
Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.289379 4885 generic.go:334] "Generic (PLEG): container finished" podID="97017523-4956-4bde-84a3-5859a2edf389" containerID="f930e10771b84878f3ee1e71b5063ef75713f3fd89c0b6b9084dd1d3903c6042" exitCode=0
Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.289515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" event={"ID":"97017523-4956-4bde-84a3-5859a2edf389","Type":"ContainerDied","Data":"f930e10771b84878f3ee1e71b5063ef75713f3fd89c0b6b9084dd1d3903c6042"}
Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.292585 4885 generic.go:334] "Generic (PLEG): container finished" podID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerID="d780b5eabbfcbe56d150e28f4acfba07ce728041d152bf3d5c0c48b9dc32a456" exitCode=0
Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.292671 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" event={"ID":"0514086c-91a5-4b80-8920-b2b78d4faba3","Type":"ContainerDied","Data":"d780b5eabbfcbe56d150e28f4acfba07ce728041d152bf3d5c0c48b9dc32a456"}
Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.803459 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gnjnd"
Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.803514 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gnjnd"
Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.851576 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xpctw"
Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.851611 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xpctw"
Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.908768 4885 patch_prober.go:28] interesting pod/controller-manager-d6687ccbf-8tvf8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body=
Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.908811 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" podUID="97017523-4956-4bde-84a3-5859a2edf389" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.061318 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t9fn7"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.061747 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t9fn7"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.132410 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.162788 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"]
Mar 08 19:36:45 crc kubenswrapper[4885]: E0308 19:36:45.163063 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5016b299-d93d-4bf9-9e51-a10e68da79bc" containerName="pruner"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.163078 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5016b299-d93d-4bf9-9e51-a10e68da79bc" containerName="pruner"
Mar 08 19:36:45 crc kubenswrapper[4885]: E0308 19:36:45.163109 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerName="route-controller-manager"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.163118 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerName="route-controller-manager"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.163237 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerName="route-controller-manager"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.163254 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5016b299-d93d-4bf9-9e51-a10e68da79bc" containerName="pruner"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.163729 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.173589 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"]
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.231023 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.300563 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.301039 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" event={"ID":"97017523-4956-4bde-84a3-5859a2edf389","Type":"ContainerDied","Data":"6e9fa8f3dc6a11417fee7c798a28fbc65a8413f8866bef37973386caf66851c6"}
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.301073 4885 scope.go:117] "RemoveContainer" containerID="f930e10771b84878f3ee1e71b5063ef75713f3fd89c0b6b9084dd1d3903c6042"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.306999 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.307008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" event={"ID":"0514086c-91a5-4b80-8920-b2b78d4faba3","Type":"ContainerDied","Data":"51d88566446210e3b83617f522140b5e6b40693eb161e53013b9f354546417e9"}
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.317557 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config\") pod \"0514086c-91a5-4b80-8920-b2b78d4faba3\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") "
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.317639 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert\") pod \"0514086c-91a5-4b80-8920-b2b78d4faba3\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") "
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.317665 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca\") pod \"0514086c-91a5-4b80-8920-b2b78d4faba3\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") "
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.317726 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtmxc\" (UniqueName: \"kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc\") pod \"0514086c-91a5-4b80-8920-b2b78d4faba3\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") "
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318388 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca" (OuterVolumeSpecName: "client-ca") pod "0514086c-91a5-4b80-8920-b2b78d4faba3" (UID: "0514086c-91a5-4b80-8920-b2b78d4faba3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318397 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config" (OuterVolumeSpecName: "config") pod "0514086c-91a5-4b80-8920-b2b78d4faba3" (UID: "0514086c-91a5-4b80-8920-b2b78d4faba3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318654 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318718 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptfss\" (UniqueName: \"kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318819 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318857 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318867 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.323572 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc" (OuterVolumeSpecName: "kube-api-access-vtmxc") pod "0514086c-91a5-4b80-8920-b2b78d4faba3" (UID: "0514086c-91a5-4b80-8920-b2b78d4faba3"). InnerVolumeSpecName "kube-api-access-vtmxc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.323799 4885 scope.go:117] "RemoveContainer" containerID="d780b5eabbfcbe56d150e28f4acfba07ce728041d152bf3d5c0c48b9dc32a456"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.323971 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0514086c-91a5-4b80-8920-b2b78d4faba3" (UID: "0514086c-91a5-4b80-8920-b2b78d4faba3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.396703 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xpctw"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.399013 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t9fn7"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.406362 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gnjnd"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421292 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87mqf\" (UniqueName: \"kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf\") pod \"97017523-4956-4bde-84a3-5859a2edf389\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") "
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421399 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles\") pod \"97017523-4956-4bde-84a3-5859a2edf389\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") "
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert\") pod \"97017523-4956-4bde-84a3-5859a2edf389\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") "
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421865 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca\") pod \"97017523-4956-4bde-84a3-5859a2edf389\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") "
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421905 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "97017523-4956-4bde-84a3-5859a2edf389" (UID: "97017523-4956-4bde-84a3-5859a2edf389"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421911 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config\") pod \"97017523-4956-4bde-84a3-5859a2edf389\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") "
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.422399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config" (OuterVolumeSpecName: "config") pod "97017523-4956-4bde-84a3-5859a2edf389" (UID: "97017523-4956-4bde-84a3-5859a2edf389"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423195 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca" (OuterVolumeSpecName: "client-ca") pod "97017523-4956-4bde-84a3-5859a2edf389" (UID: "97017523-4956-4bde-84a3-5859a2edf389"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423368 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423494 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptfss\" (UniqueName: \"kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423639 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423776 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423823 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423839 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423852 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423865 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423878 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtmxc\" (UniqueName: \"kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.425430 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.426106 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "97017523-4956-4bde-84a3-5859a2edf389" (UID: "97017523-4956-4bde-84a3-5859a2edf389"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.426151 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.427452 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf" (OuterVolumeSpecName: "kube-api-access-87mqf") pod "97017523-4956-4bde-84a3-5859a2edf389" (UID: "97017523-4956-4bde-84a3-5859a2edf389"). InnerVolumeSpecName "kube-api-access-87mqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.428382 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.446907 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptfss\" (UniqueName: \"kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.452140 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gnjnd"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.452739 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xpctw"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.482158 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.524910 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87mqf\" (UniqueName: \"kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.524967 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.651244 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"]
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.654228 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"]
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.656635 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"]
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.658869 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"]
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.691589 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"]
Mar 08 19:36:45 crc kubenswrapper[4885]: W0308 19:36:45.692666 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bb8d3c0_c05c_4557_b84d_94c2b20add8e.slice/crio-8af151f617b1a4bcfdce5bb8f6b08876de153fa4a744d447aa3b6026b9364569 WatchSource:0}: Error finding container 8af151f617b1a4bcfdce5bb8f6b08876de153fa4a744d447aa3b6026b9364569: Status 404 returned error can't find the container with id 8af151f617b1a4bcfdce5bb8f6b08876de153fa4a744d447aa3b6026b9364569
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.924040 4885 patch_prober.go:28] interesting pod/route-controller-manager-6c8756cb88-5rd25 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.924105 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 08 19:36:46 crc kubenswrapper[4885]: I0308 19:36:46.321663 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" event={"ID":"7bb8d3c0-c05c-4557-b84d-94c2b20add8e","Type":"ContainerStarted","Data":"8af151f617b1a4bcfdce5bb8f6b08876de153fa4a744d447aa3b6026b9364569"}
Mar 08 19:36:46 crc kubenswrapper[4885]: E0308 19:36:46.371501 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-59wjr" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045"
Mar 08 19:36:46 crc kubenswrapper[4885]: I0308 19:36:46.379046 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t9fn7"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.333295 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" event={"ID":"7bb8d3c0-c05c-4557-b84d-94c2b20add8e","Type":"ContainerStarted","Data":"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef"}
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.357941 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" podStartSLOduration=5.357895835 podStartE2EDuration="5.357895835s" podCreationTimestamp="2026-03-08 19:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:47.357197107 +0000 UTC m=+308.753251130" watchObservedRunningTime="2026-03-08 19:36:47.357895835 +0000 UTC m=+308.753949868"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.380275 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" path="/var/lib/kubelet/pods/0514086c-91a5-4b80-8920-b2b78d4faba3/volumes"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.381411 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97017523-4956-4bde-84a3-5859a2edf389" path="/var/lib/kubelet/pods/97017523-4956-4bde-84a3-5859a2edf389/volumes"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.554915 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"]
Mar 08 19:36:47 crc kubenswrapper[4885]: E0308 19:36:47.555269 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97017523-4956-4bde-84a3-5859a2edf389" containerName="controller-manager"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.555292 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="97017523-4956-4bde-84a3-5859a2edf389" containerName="controller-manager"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.555474 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="97017523-4956-4bde-84a3-5859a2edf389" containerName="controller-manager"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.556077 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.566229 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.566621 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.574246 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.574345 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.574482 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.574668 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.584691 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.585557 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"]
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.625434 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t9fn7"]
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.650795 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.650836 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.650881 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.651023 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwkdh\" (UniqueName: \"kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.651163 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.753124 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.753245 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.753987 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.754050 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwkdh\" (UniqueName: \"kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.754134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.755266 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.755907 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.758380 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.763616 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.790053 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwkdh\" (UniqueName: \"kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.843658 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-prdq9"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.844025 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-prdq9"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.886348 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.916176 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-prdq9"
Mar 08 19:36:48 crc kubenswrapper[4885]: I0308 19:36:48.348423 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t9fn7" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="registry-server" containerID="cri-o://27737080ffcd1af304d093dd7fb9b8c98d2e1d3a468d0c510cc2a7d938b9d1b7" gracePeriod=2
Mar 08 19:36:48 crc kubenswrapper[4885]: I0308 19:36:48.349831 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:48 crc kubenswrapper[4885]: I0308 19:36:48.356996 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"]
Mar 08 19:36:48 crc kubenswrapper[4885]: I0308 19:36:48.361630 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:36:48 crc kubenswrapper[4885]: I0308 19:36:48.433276 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-prdq9"
Mar 08 19:36:49 crc kubenswrapper[4885]: I0308 19:36:49.355715 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" event={"ID":"20ebec2c-a495-4d93-b35f-ddd022a21564","Type":"ContainerStarted","Data":"84690645be4eef59546f1222f77afbe3b95d771a6460611313ea0fa767e54a31"}
Mar 08 19:36:49 crc kubenswrapper[4885]: I0308 19:36:49.358847 4885 generic.go:334] "Generic (PLEG): container finished" podID="038004f7-92de-42b0-8951-447dfdaf2f83"
containerID="27737080ffcd1af304d093dd7fb9b8c98d2e1d3a468d0c510cc2a7d938b9d1b7" exitCode=0 Mar 08 19:36:49 crc kubenswrapper[4885]: I0308 19:36:49.358939 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerDied","Data":"27737080ffcd1af304d093dd7fb9b8c98d2e1d3a468d0c510cc2a7d938b9d1b7"} Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.196834 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.222561 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content\") pod \"038004f7-92de-42b0-8951-447dfdaf2f83\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.222674 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities\") pod \"038004f7-92de-42b0-8951-447dfdaf2f83\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.222706 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhfws\" (UniqueName: \"kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws\") pod \"038004f7-92de-42b0-8951-447dfdaf2f83\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.226764 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities" (OuterVolumeSpecName: "utilities") pod "038004f7-92de-42b0-8951-447dfdaf2f83" (UID: 
"038004f7-92de-42b0-8951-447dfdaf2f83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.234666 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws" (OuterVolumeSpecName: "kube-api-access-zhfws") pod "038004f7-92de-42b0-8951-447dfdaf2f83" (UID: "038004f7-92de-42b0-8951-447dfdaf2f83"). InnerVolumeSpecName "kube-api-access-zhfws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.324765 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.324812 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhfws\" (UniqueName: \"kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.328352 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "038004f7-92de-42b0-8951-447dfdaf2f83" (UID: "038004f7-92de-42b0-8951-447dfdaf2f83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.373407 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.377503 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.377529 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerDied","Data":"75e0661a1b23d764b6ba36fa438a7e6bc5398d8d64dd93e640a640cee2d85a8d"} Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.377546 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" event={"ID":"20ebec2c-a495-4d93-b35f-ddd022a21564","Type":"ContainerStarted","Data":"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1"} Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.377562 4885 scope.go:117] "RemoveContainer" containerID="27737080ffcd1af304d093dd7fb9b8c98d2e1d3a468d0c510cc2a7d938b9d1b7" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.411378 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" podStartSLOduration=9.411364815 podStartE2EDuration="9.411364815s" podCreationTimestamp="2026-03-08 19:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:51.410335538 +0000 UTC m=+312.806389551" watchObservedRunningTime="2026-03-08 19:36:51.411364815 +0000 UTC m=+312.807418838" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.422476 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t9fn7"] Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.425300 4885 reconciler_common.go:293] "Volume 
detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.427133 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.429008 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t9fn7"] Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.844745 4885 scope.go:117] "RemoveContainer" containerID="119e5e86f78bbb6aaaede8af002474c8df38720baf17de30416c30bd76579ad1" Mar 08 19:36:52 crc kubenswrapper[4885]: I0308 19:36:52.109031 4885 scope.go:117] "RemoveContainer" containerID="3a239ba546d78b237c7c4654423d382cd43547556be78002c1edfa9e41f28b19" Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.376637 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" path="/var/lib/kubelet/pods/038004f7-92de-42b0-8951-447dfdaf2f83/volumes" Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.394540 4885 generic.go:334] "Generic (PLEG): container finished" podID="8881ba5e-d9d1-42a9-98af-849e72053757" containerID="2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7" exitCode=0 Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.394655 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerDied","Data":"2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7"} Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.397455 4885 generic.go:334] "Generic (PLEG): container finished" podID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerID="189818c391ca54e42f66a3022db5b2d8456e8ff7e65867b70d5d4849a35c1bc1" exitCode=0 Mar 08 
19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.397550 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerDied","Data":"189818c391ca54e42f66a3022db5b2d8456e8ff7e65867b70d5d4849a35c1bc1"} Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.401276 4885 generic.go:334] "Generic (PLEG): container finished" podID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerID="67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de" exitCode=0 Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.401470 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerDied","Data":"67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de"} Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.411692 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerStarted","Data":"47c92a0374e476a5eb0fbf68fbb5d0f2fff3e28d76c4d73cdd5e57511a785c76"} Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.415082 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerStarted","Data":"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876"} Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.418198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerStarted","Data":"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c"} Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.441581 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-cptvd" podStartSLOduration=20.574997071 podStartE2EDuration="1m28.441553915s" podCreationTimestamp="2026-03-08 19:35:26 +0000 UTC" firstStartedPulling="2026-03-08 19:35:45.945204319 +0000 UTC m=+247.341258352" lastFinishedPulling="2026-03-08 19:36:53.811761163 +0000 UTC m=+315.207815196" observedRunningTime="2026-03-08 19:36:54.438569864 +0000 UTC m=+315.834623897" watchObservedRunningTime="2026-03-08 19:36:54.441553915 +0000 UTC m=+315.837607938" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.459081 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift" containerID="cri-o://4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72" gracePeriod=15 Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.490548 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-62xgk" podStartSLOduration=20.652805907 podStartE2EDuration="1m28.490525828s" podCreationTimestamp="2026-03-08 19:35:26 +0000 UTC" firstStartedPulling="2026-03-08 19:35:45.94751796 +0000 UTC m=+247.343571983" lastFinishedPulling="2026-03-08 19:36:53.785237841 +0000 UTC m=+315.181291904" observedRunningTime="2026-03-08 19:36:54.486367697 +0000 UTC m=+315.882421720" watchObservedRunningTime="2026-03-08 19:36:54.490525828 +0000 UTC m=+315.886579851" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.490665 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pqxt7" podStartSLOduration=19.610406121 podStartE2EDuration="1m27.490659682s" podCreationTimestamp="2026-03-08 19:35:27 +0000 UTC" firstStartedPulling="2026-03-08 19:35:45.941191531 +0000 UTC m=+247.337245584" lastFinishedPulling="2026-03-08 19:36:53.821445112 +0000 UTC m=+315.217499145" 
observedRunningTime="2026-03-08 19:36:54.462831906 +0000 UTC m=+315.858885939" watchObservedRunningTime="2026-03-08 19:36:54.490659682 +0000 UTC m=+315.886713715" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.836045 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888437 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888523 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888551 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888576 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888597 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888650 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888675 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888701 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjdvb\" (UniqueName: \"kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888721 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888746 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888781 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888805 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888849 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888881 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.890000 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: 
"v4-0-config-system-trusted-ca-bundle") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.890093 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.892441 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.892518 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.893164 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). 
InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.902584 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.903473 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.904070 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.904327 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.905275 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.905508 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.906831 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.908762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.916043 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb" (OuterVolumeSpecName: "kube-api-access-sjdvb") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "kube-api-access-sjdvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.990753 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991070 4885 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991507 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991608 4885 reconciler_common.go:293] "Volume detached 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991707 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991807 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991936 4885 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992039 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992126 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992221 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992306 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjdvb\" (UniqueName: \"kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992394 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992487 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992588 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.423682 4885 generic.go:334] "Generic (PLEG): container finished" podID="d008be41-8eac-496a-9c3d-083014dc402c" containerID="4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72" exitCode=0
Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.423720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" event={"ID":"d008be41-8eac-496a-9c3d-083014dc402c","Type":"ContainerDied","Data":"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72"}
Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.423743 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" event={"ID":"d008be41-8eac-496a-9c3d-083014dc402c","Type":"ContainerDied","Data":"f8be22b6ff0acffcb9ccf46c1c57e368d284120713d460abb8b81012323f6dcf"}
Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.423759 4885 scope.go:117] "RemoveContainer" containerID="4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72"
Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.423857 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t"
Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.440413 4885 scope.go:117] "RemoveContainer" containerID="4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72"
Mar 08 19:36:55 crc kubenswrapper[4885]: E0308 19:36:55.441374 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72\": container with ID starting with 4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72 not found: ID does not exist" containerID="4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72"
Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.441406 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72"} err="failed to get container status \"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72\": rpc error: code = NotFound desc = could not find container \"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72\": container with ID starting with 4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72 not found: ID does not exist"
Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.458152 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"]
Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.468884 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"]
Mar 08 19:36:56 crc kubenswrapper[4885]: I0308 19:36:56.822548 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-62xgk"
Mar 08 19:36:56 crc kubenswrapper[4885]: I0308 19:36:56.822607 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-62xgk"
Mar 08 19:36:56 crc kubenswrapper[4885]: I0308 19:36:56.877455 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-62xgk"
Mar 08 19:36:57 crc kubenswrapper[4885]: I0308 19:36:57.223357 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cptvd"
Mar 08 19:36:57 crc kubenswrapper[4885]: I0308 19:36:57.223706 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cptvd"
Mar 08 19:36:57 crc kubenswrapper[4885]: I0308 19:36:57.275446 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cptvd"
Mar 08 19:36:57 crc kubenswrapper[4885]: I0308 19:36:57.376205 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d008be41-8eac-496a-9c3d-083014dc402c" path="/var/lib/kubelet/pods/d008be41-8eac-496a-9c3d-083014dc402c/volumes"
Mar 08 19:36:58 crc kubenswrapper[4885]: I0308 19:36:58.224419 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pqxt7"
Mar 08 19:36:58 crc kubenswrapper[4885]: I0308 19:36:58.224474 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pqxt7"
Mar 08 19:36:58 crc kubenswrapper[4885]: I0308 19:36:58.447639 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerStarted","Data":"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58"}
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.294685 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pqxt7" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="registry-server" probeResult="failure" output=<
Mar 08 19:36:59 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s
Mar 08 19:36:59 crc kubenswrapper[4885]: >
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.457394 4885 generic.go:334] "Generic (PLEG): container finished" podID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerID="0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58" exitCode=0
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.457456 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerDied","Data":"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58"}
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.568798 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"]
Mar 08 19:36:59 crc kubenswrapper[4885]: E0308 19:36:59.570086 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="registry-server"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.570178 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="registry-server"
Mar 08 19:36:59 crc kubenswrapper[4885]: E0308 19:36:59.570265 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="extract-utilities"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.570285 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="extract-utilities"
Mar 08 19:36:59 crc kubenswrapper[4885]: E0308 19:36:59.570357 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="extract-content"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.570374 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="extract-content"
Mar 08 19:36:59 crc kubenswrapper[4885]: E0308 19:36:59.570442 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.570456 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.571217 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.571262 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="registry-server"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.572192 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.582853 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.583169 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.583432 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.583609 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.583905 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.584419 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.585839 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.586188 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.586803 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.587352 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.587399 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.587865 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.596305 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"]
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.604251 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.609253 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.617988 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652529 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652624 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652737 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652786 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-dir\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652824 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652857 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653004 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-error\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653143 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653198 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-session\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653337 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qddj\" (UniqueName: \"kubernetes.io/projected/0a3ef361-6968-4088-a2f1-ca52eb3715b6-kube-api-access-5qddj\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653399 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653433 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-login\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653516 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-policies\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653655 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.754557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.755884 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-dir\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.755988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756049 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756100 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-error\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756311 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-session\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756419 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qddj\" (UniqueName: \"kubernetes.io/projected/0a3ef361-6968-4088-a2f1-ca52eb3715b6-kube-api-access-5qddj\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756482 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756531 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-login\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756582 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-policies\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756634 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756686 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756773 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.758068 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.760153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-policies\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.760879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.761742 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-dir\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.762251 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.766327 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.767830 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.767880 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-error\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.768303 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.768851 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-session\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.769098 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-login\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.770254 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.773412 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.783238 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qddj\" (UniqueName: \"kubernetes.io/projected/0a3ef361-6968-4088-a2f1-ca52eb3715b6-kube-api-access-5qddj\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.928428 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:37:00 crc kubenswrapper[4885]: I0308 19:37:00.377164 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"]
Mar 08 19:37:00 crc kubenswrapper[4885]: I0308 19:37:00.465915 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" event={"ID":"0a3ef361-6968-4088-a2f1-ca52eb3715b6","Type":"ContainerStarted","Data":"c963215a061577702d5c9e97cdfac6b159d77b238aa349674f278b04c35add44"}
Mar 08 19:37:00 crc kubenswrapper[4885]: I0308 19:37:00.468842 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerStarted","Data":"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2"}
Mar 08 19:37:01 crc kubenswrapper[4885]: I0308 19:37:01.479271 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" event={"ID":"0a3ef361-6968-4088-a2f1-ca52eb3715b6","Type":"ContainerStarted","Data":"b84642920228051864b70e57a044678fd7a3795095fb55824c922cf77c0242b1"}
Mar 08 19:37:01 crc kubenswrapper[4885]: I0308 19:37:01.479662 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:37:01 crc kubenswrapper[4885]: I0308 19:37:01.489113 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"
Mar 08 19:37:01 crc kubenswrapper[4885]: I0308 19:37:01.513275 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-59wjr" podStartSLOduration=3.795651429 podStartE2EDuration="1m37.513251593s" podCreationTimestamp="2026-03-08 19:35:24 +0000 UTC" firstStartedPulling="2026-03-08 19:35:26.264809003 +0000 UTC m=+227.660863026" lastFinishedPulling="2026-03-08 19:36:59.982409127 +0000 UTC m=+321.378463190" observedRunningTime="2026-03-08 19:37:00.48838211 +0000 UTC m=+321.884436173" watchObservedRunningTime="2026-03-08 19:37:01.513251593 +0000 UTC m=+322.909305656"
Mar 08 19:37:01 crc kubenswrapper[4885]: I0308 19:37:01.516314 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" podStartSLOduration=32.516300135 podStartE2EDuration="32.516300135s" podCreationTimestamp="2026-03-08 19:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:37:01.510450698 +0000 UTC m=+322.906504751" watchObservedRunningTime="2026-03-08 19:37:01.516300135 +0000 UTC m=+322.912354198"
Mar 08 19:37:02 crc kubenswrapper[4885]: I0308 19:37:02.573571 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"]
Mar 08 19:37:02 crc kubenswrapper[4885]: I0308 19:37:02.573903 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" podUID="20ebec2c-a495-4d93-b35f-ddd022a21564" containerName="controller-manager" containerID="cri-o://c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1" gracePeriod=30
Mar 08 19:37:02 crc kubenswrapper[4885]: I0308 19:37:02.592228 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"]
Mar 08 19:37:02 crc kubenswrapper[4885]: I0308 19:37:02.592477 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" podUID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" containerName="route-controller-manager" containerID="cri-o://e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef" gracePeriod=30
Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.166057 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"
Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.168506 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"
Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.306140 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert\") pod \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") "
Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.306190 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca\") pod \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") "
Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.306214 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwkdh\" (UniqueName: \"kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh\") pod \"20ebec2c-a495-4d93-b35f-ddd022a21564\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") "
Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.306363 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca\") pod \"20ebec2c-a495-4d93-b35f-ddd022a21564\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") "
Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.306393 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config\") pod \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") "
Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca" (OuterVolumeSpecName: "client-ca") pod "20ebec2c-a495-4d93-b35f-ddd022a21564" (UID: "20ebec2c-a495-4d93-b35f-ddd022a21564"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307200 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config" (OuterVolumeSpecName: "config") pod "7bb8d3c0-c05c-4557-b84d-94c2b20add8e" (UID: "7bb8d3c0-c05c-4557-b84d-94c2b20add8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307246 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config\") pod \"20ebec2c-a495-4d93-b35f-ddd022a21564\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") "
Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307470 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca" (OuterVolumeSpecName: "client-ca") pod "7bb8d3c0-c05c-4557-b84d-94c2b20add8e" (UID: "7bb8d3c0-c05c-4557-b84d-94c2b20add8e"). InnerVolumeSpecName "client-ca".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307542 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert\") pod \"20ebec2c-a495-4d93-b35f-ddd022a21564\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307989 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptfss\" (UniqueName: \"kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss\") pod \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308012 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config" (OuterVolumeSpecName: "config") pod "20ebec2c-a495-4d93-b35f-ddd022a21564" (UID: "20ebec2c-a495-4d93-b35f-ddd022a21564"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308038 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles\") pod \"20ebec2c-a495-4d93-b35f-ddd022a21564\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308505 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308544 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308562 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308577 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308632 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "20ebec2c-a495-4d93-b35f-ddd022a21564" (UID: "20ebec2c-a495-4d93-b35f-ddd022a21564"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.310755 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh" (OuterVolumeSpecName: "kube-api-access-zwkdh") pod "20ebec2c-a495-4d93-b35f-ddd022a21564" (UID: "20ebec2c-a495-4d93-b35f-ddd022a21564"). InnerVolumeSpecName "kube-api-access-zwkdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.310946 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7bb8d3c0-c05c-4557-b84d-94c2b20add8e" (UID: "7bb8d3c0-c05c-4557-b84d-94c2b20add8e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.311249 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss" (OuterVolumeSpecName: "kube-api-access-ptfss") pod "7bb8d3c0-c05c-4557-b84d-94c2b20add8e" (UID: "7bb8d3c0-c05c-4557-b84d-94c2b20add8e"). InnerVolumeSpecName "kube-api-access-ptfss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.311293 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "20ebec2c-a495-4d93-b35f-ddd022a21564" (UID: "20ebec2c-a495-4d93-b35f-ddd022a21564"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.409982 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.410010 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptfss\" (UniqueName: \"kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.410020 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.410030 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.410041 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwkdh\" (UniqueName: \"kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.494835 4885 generic.go:334] "Generic (PLEG): container finished" podID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" containerID="e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef" exitCode=0 Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.494899 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" 
event={"ID":"7bb8d3c0-c05c-4557-b84d-94c2b20add8e","Type":"ContainerDied","Data":"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef"} Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.494949 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" event={"ID":"7bb8d3c0-c05c-4557-b84d-94c2b20add8e","Type":"ContainerDied","Data":"8af151f617b1a4bcfdce5bb8f6b08876de153fa4a744d447aa3b6026b9364569"} Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.494971 4885 scope.go:117] "RemoveContainer" containerID="e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.494968 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.499110 4885 generic.go:334] "Generic (PLEG): container finished" podID="20ebec2c-a495-4d93-b35f-ddd022a21564" containerID="c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1" exitCode=0 Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.499222 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.499284 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" event={"ID":"20ebec2c-a495-4d93-b35f-ddd022a21564","Type":"ContainerDied","Data":"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1"} Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.499477 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" event={"ID":"20ebec2c-a495-4d93-b35f-ddd022a21564","Type":"ContainerDied","Data":"84690645be4eef59546f1222f77afbe3b95d771a6460611313ea0fa767e54a31"} Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.533868 4885 scope.go:117] "RemoveContainer" containerID="e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef" Mar 08 19:37:03 crc kubenswrapper[4885]: E0308 19:37:03.535062 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef\": container with ID starting with e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef not found: ID does not exist" containerID="e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.535112 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef"} err="failed to get container status \"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef\": rpc error: code = NotFound desc = could not find container \"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef\": container with ID starting with e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef not found: ID does 
not exist" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.535145 4885 scope.go:117] "RemoveContainer" containerID="c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.538272 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"] Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.548388 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"] Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.553455 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"] Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.557842 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"] Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.558516 4885 scope.go:117] "RemoveContainer" containerID="c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1" Mar 08 19:37:03 crc kubenswrapper[4885]: E0308 19:37:03.559161 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1\": container with ID starting with c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1 not found: ID does not exist" containerID="c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.559208 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1"} err="failed to get container status \"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1\": rpc error: code = NotFound desc = could not 
find container \"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1\": container with ID starting with c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1 not found: ID does not exist" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.573984 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56b9b7b549-bcjcp"] Mar 08 19:37:04 crc kubenswrapper[4885]: E0308 19:37:04.574366 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ebec2c-a495-4d93-b35f-ddd022a21564" containerName="controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.574393 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ebec2c-a495-4d93-b35f-ddd022a21564" containerName="controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: E0308 19:37:04.574439 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" containerName="route-controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.574457 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" containerName="route-controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.574668 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" containerName="route-controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.574707 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ebec2c-a495-4d93-b35f-ddd022a21564" containerName="controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.575538 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.579030 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.579885 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.580455 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.582428 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.582533 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.582666 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.583180 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb"] Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.584276 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.591070 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.591325 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56b9b7b549-bcjcp"] Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.594156 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.595268 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.595603 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.595792 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.596200 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb"] Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.596301 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.597648 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.729735 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/e160affd-520c-4dad-89d4-d2a511c8a545-serving-cert\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.729784 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-client-ca\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.729823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-proxy-ca-bundles\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.729907 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-client-ca\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.729992 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-config\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " 
pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.730040 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/563046aa-bf59-4d04-bc57-ab1250051468-serving-cert\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.730096 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-config\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.730136 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrx5q\" (UniqueName: \"kubernetes.io/projected/e160affd-520c-4dad-89d4-d2a511c8a545-kube-api-access-mrx5q\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.730163 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsvqv\" (UniqueName: \"kubernetes.io/projected/563046aa-bf59-4d04-bc57-ab1250051468-kube-api-access-gsvqv\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.831800 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-client-ca\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.831993 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-config\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832045 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/563046aa-bf59-4d04-bc57-ab1250051468-serving-cert\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832117 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-config\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrx5q\" (UniqueName: \"kubernetes.io/projected/e160affd-520c-4dad-89d4-d2a511c8a545-kube-api-access-mrx5q\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 
crc kubenswrapper[4885]: I0308 19:37:04.832242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsvqv\" (UniqueName: \"kubernetes.io/projected/563046aa-bf59-4d04-bc57-ab1250051468-kube-api-access-gsvqv\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-client-ca\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832407 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e160affd-520c-4dad-89d4-d2a511c8a545-serving-cert\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832471 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-proxy-ca-bundles\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.835115 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-client-ca\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: 
\"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.835648 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-client-ca\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.835756 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-config\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.836257 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-proxy-ca-bundles\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.836564 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-config\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.839405 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e160affd-520c-4dad-89d4-d2a511c8a545-serving-cert\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.841129 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/563046aa-bf59-4d04-bc57-ab1250051468-serving-cert\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.871235 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsvqv\" (UniqueName: \"kubernetes.io/projected/563046aa-bf59-4d04-bc57-ab1250051468-kube-api-access-gsvqv\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.871308 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrx5q\" (UniqueName: \"kubernetes.io/projected/e160affd-520c-4dad-89d4-d2a511c8a545-kube-api-access-mrx5q\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.937756 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.944020 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.218141 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56b9b7b549-bcjcp"] Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.228804 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.229078 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:05 crc kubenswrapper[4885]: W0308 19:37:05.230437 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod563046aa_bf59_4d04_bc57_ab1250051468.slice/crio-ed38506332a797e1d338ce9cae3482924654d6406af0f29e4ccc6c80be0c346c WatchSource:0}: Error finding container ed38506332a797e1d338ce9cae3482924654d6406af0f29e4ccc6c80be0c346c: Status 404 returned error can't find the container with id ed38506332a797e1d338ce9cae3482924654d6406af0f29e4ccc6c80be0c346c Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.296197 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb"] Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.298467 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:05 crc kubenswrapper[4885]: W0308 19:37:05.302868 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode160affd_520c_4dad_89d4_d2a511c8a545.slice/crio-7e35ffef721c9715d35e5de766bc40f626e32792654484ee7e99e181b63cdcbf WatchSource:0}: Error finding container 
7e35ffef721c9715d35e5de766bc40f626e32792654484ee7e99e181b63cdcbf: Status 404 returned error can't find the container with id 7e35ffef721c9715d35e5de766bc40f626e32792654484ee7e99e181b63cdcbf Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.383130 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ebec2c-a495-4d93-b35f-ddd022a21564" path="/var/lib/kubelet/pods/20ebec2c-a495-4d93-b35f-ddd022a21564/volumes" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.384097 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" path="/var/lib/kubelet/pods/7bb8d3c0-c05c-4557-b84d-94c2b20add8e/volumes" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.520384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" event={"ID":"563046aa-bf59-4d04-bc57-ab1250051468","Type":"ContainerStarted","Data":"66650e65234d81333ec0a7ba90489540e0f4da97f51390243281f124944d530c"} Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.520438 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" event={"ID":"563046aa-bf59-4d04-bc57-ab1250051468","Type":"ContainerStarted","Data":"ed38506332a797e1d338ce9cae3482924654d6406af0f29e4ccc6c80be0c346c"} Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.520602 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.524499 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" event={"ID":"e160affd-520c-4dad-89d4-d2a511c8a545","Type":"ContainerStarted","Data":"1f309038fd62bb3ff6b0bf5d6e8030902e0042a22ceb1bbd2bf6e7f8c5695924"} Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.524569 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" event={"ID":"e160affd-520c-4dad-89d4-d2a511c8a545","Type":"ContainerStarted","Data":"7e35ffef721c9715d35e5de766bc40f626e32792654484ee7e99e181b63cdcbf"} Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.527155 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.542145 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" podStartSLOduration=3.542125083 podStartE2EDuration="3.542125083s" podCreationTimestamp="2026-03-08 19:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:37:05.541335022 +0000 UTC m=+326.937389275" watchObservedRunningTime="2026-03-08 19:37:05.542125083 +0000 UTC m=+326.938179106" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.567724 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" podStartSLOduration=3.567702334 podStartE2EDuration="3.567702334s" podCreationTimestamp="2026-03-08 19:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:37:05.564501579 +0000 UTC m=+326.960555602" watchObservedRunningTime="2026-03-08 19:37:05.567702334 +0000 UTC m=+326.963756357" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.578519 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.807141 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 19:37:06 crc kubenswrapper[4885]: I0308 19:37:06.531732 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:06 crc kubenswrapper[4885]: I0308 19:37:06.541083 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:06 crc kubenswrapper[4885]: I0308 19:37:06.905697 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:37:07 crc kubenswrapper[4885]: I0308 19:37:07.300105 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:37:07 crc kubenswrapper[4885]: I0308 19:37:07.535965 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-59wjr" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="registry-server" containerID="cri-o://51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2" gracePeriod=2 Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.062137 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.180979 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdjrg\" (UniqueName: \"kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg\") pod \"7346fb7f-6125-49c7-a422-cc169bc7e045\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.181106 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities\") pod \"7346fb7f-6125-49c7-a422-cc169bc7e045\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.181174 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content\") pod \"7346fb7f-6125-49c7-a422-cc169bc7e045\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.182620 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities" (OuterVolumeSpecName: "utilities") pod "7346fb7f-6125-49c7-a422-cc169bc7e045" (UID: "7346fb7f-6125-49c7-a422-cc169bc7e045"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.186834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg" (OuterVolumeSpecName: "kube-api-access-bdjrg") pod "7346fb7f-6125-49c7-a422-cc169bc7e045" (UID: "7346fb7f-6125-49c7-a422-cc169bc7e045"). InnerVolumeSpecName "kube-api-access-bdjrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.206703 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"] Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.206942 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cptvd" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="registry-server" containerID="cri-o://47c92a0374e476a5eb0fbf68fbb5d0f2fff3e28d76c4d73cdd5e57511a785c76" gracePeriod=2 Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.265344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7346fb7f-6125-49c7-a422-cc169bc7e045" (UID: "7346fb7f-6125-49c7-a422-cc169bc7e045"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.284072 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.284222 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdjrg\" (UniqueName: \"kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.284324 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.288073 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.336237 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.548807 4885 generic.go:334] "Generic (PLEG): container finished" podID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerID="47c92a0374e476a5eb0fbf68fbb5d0f2fff3e28d76c4d73cdd5e57511a785c76" exitCode=0 Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.548970 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerDied","Data":"47c92a0374e476a5eb0fbf68fbb5d0f2fff3e28d76c4d73cdd5e57511a785c76"} Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.560553 4885 generic.go:334] "Generic (PLEG): container finished" podID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerID="51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2" exitCode=0 Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.560635 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerDied","Data":"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2"} Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.560677 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.560705 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerDied","Data":"e265209fea1ffc2ce0afb5176cc04ccaf2989324d8da3b88689366337825e2af"} Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.560758 4885 scope.go:117] "RemoveContainer" containerID="51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.584756 4885 scope.go:117] "RemoveContainer" containerID="0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.602709 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.606088 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.634596 4885 scope.go:117] "RemoveContainer" containerID="3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.652481 4885 scope.go:117] "RemoveContainer" containerID="51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2" Mar 08 19:37:08 crc kubenswrapper[4885]: E0308 19:37:08.652868 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2\": container with ID starting with 51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2 not found: ID does not exist" containerID="51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.652943 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2"} err="failed to get container status \"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2\": rpc error: code = NotFound desc = could not find container \"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2\": container with ID starting with 51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2 not found: ID does not exist" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.652981 4885 scope.go:117] "RemoveContainer" containerID="0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58" Mar 08 19:37:08 crc kubenswrapper[4885]: E0308 19:37:08.653347 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58\": container with ID starting with 0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58 not found: ID does not exist" containerID="0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.653382 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58"} err="failed to get container status \"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58\": rpc error: code = NotFound desc = could not find container \"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58\": container with ID starting with 0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58 not found: ID does not exist" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.653414 4885 scope.go:117] "RemoveContainer" containerID="3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd" Mar 08 19:37:08 crc kubenswrapper[4885]: E0308 
19:37:08.653643 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd\": container with ID starting with 3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd not found: ID does not exist" containerID="3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.653681 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd"} err="failed to get container status \"3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd\": rpc error: code = NotFound desc = could not find container \"3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd\": container with ID starting with 3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd not found: ID does not exist" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.732071 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.894501 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn2mq\" (UniqueName: \"kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq\") pod \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.894586 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities\") pod \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.894675 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content\") pod \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.896068 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities" (OuterVolumeSpecName: "utilities") pod "b30ce2c5-2b53-47aa-8470-394dd0d6256a" (UID: "b30ce2c5-2b53-47aa-8470-394dd0d6256a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.900617 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq" (OuterVolumeSpecName: "kube-api-access-jn2mq") pod "b30ce2c5-2b53-47aa-8470-394dd0d6256a" (UID: "b30ce2c5-2b53-47aa-8470-394dd0d6256a"). InnerVolumeSpecName "kube-api-access-jn2mq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.917398 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b30ce2c5-2b53-47aa-8470-394dd0d6256a" (UID: "b30ce2c5-2b53-47aa-8470-394dd0d6256a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.997321 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.997373 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn2mq\" (UniqueName: \"kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.997395 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.381182 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" path="/var/lib/kubelet/pods/7346fb7f-6125-49c7-a422-cc169bc7e045/volumes" Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.570188 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerDied","Data":"abebee8a42d402151f81519cf2493ac06e94550afb46b420c76c20ec60907798"} Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.570247 4885 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.570263 4885 scope.go:117] "RemoveContainer" containerID="47c92a0374e476a5eb0fbf68fbb5d0f2fff3e28d76c4d73cdd5e57511a785c76" Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.591146 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"] Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.592629 4885 scope.go:117] "RemoveContainer" containerID="189818c391ca54e42f66a3022db5b2d8456e8ff7e65867b70d5d4849a35c1bc1" Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.603016 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"] Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.617283 4885 scope.go:117] "RemoveContainer" containerID="539ba9557c56dcf3cdbab11c5e667581fd8e8b4b7a9df312373694f7ca85489f" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.265248 4885 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.265864 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="extract-content" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.265886 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="extract-content" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.265949 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.265965 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 
19:37:11.265984 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="extract-utilities" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.265996 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="extract-utilities" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.266018 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="extract-utilities" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.266030 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="extract-utilities" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.266042 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.266054 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.266076 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="extract-content" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.266088 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="extract-content" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.266248 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.266265 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: 
I0308 19:37:11.266787 4885 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267002 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267241 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad" gracePeriod=15 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267332 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548" gracePeriod=15 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267331 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0" gracePeriod=15 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267396 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa" gracePeriod=15 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267450 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65" gracePeriod=15 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.271359 4885 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.271795 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.271842 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.271879 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.271899 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.271974 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.271994 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272021 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272039 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272068 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272085 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272107 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272122 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272143 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272162 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272192 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272209 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272235 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272253 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272509 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272533 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272553 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272577 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272603 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272625 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272646 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272664 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272911 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272972 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.273229 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.385674 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" path="/var/lib/kubelet/pods/b30ce2c5-2b53-47aa-8470-394dd0d6256a/volumes"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435574 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435659 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435691 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435739 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435774 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435797 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435817 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435840 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538576 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538649 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538680 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538721 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538752 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538773 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538793 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538818 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538898 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539026 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539059 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539088 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539118 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539190 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539320 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.589138 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf6a8691-f048-4173-8c9e-06bca13e37e1" containerID="3c326d6a1a9e49cd242fcac137bf8c9ce49e4cb4b6826f444d12dd45c38d8ec9" exitCode=0
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.589202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cf6a8691-f048-4173-8c9e-06bca13e37e1","Type":"ContainerDied","Data":"3c326d6a1a9e49cd242fcac137bf8c9ce49e4cb4b6826f444d12dd45c38d8ec9"}
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.590421 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.591521 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.592761 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.593351 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548" exitCode=0
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.593424 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65" exitCode=0
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.593439 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0" exitCode=0
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.593453 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa" exitCode=2
Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.593504 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e"
Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.350687 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.351996 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 08 19:37:12 crc kubenswrapper[4885]: W0308 19:37:12.352065 4885 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 08 19:37:12 crc kubenswrapper[4885]: E0308 19:37:12.352283 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.352203 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.352391 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:37:12 crc kubenswrapper[4885]: W0308 19:37:12.352830 4885 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 08 19:37:12 crc kubenswrapper[4885]: E0308 19:37:12.353012 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 08 19:37:12 crc kubenswrapper[4885]: W0308 19:37:12.353813 4885 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 08 19:37:12 crc kubenswrapper[4885]: E0308 19:37:12.353950 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.453663 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r"
Mar 08 19:37:12 crc kubenswrapper[4885]: W0308 19:37:12.454504 4885 reflector.go:561] object-"openshift-multus"/"metrics-daemon-secret": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 08 19:37:12 crc kubenswrapper[4885]: E0308 19:37:12.454643 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"metrics-daemon-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.610267 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.106043 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.107565 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access\") pod \"cf6a8691-f048-4173-8c9e-06bca13e37e1\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") "
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165302 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock\") pod \"cf6a8691-f048-4173-8c9e-06bca13e37e1\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") "
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165354 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir\") pod \"cf6a8691-f048-4173-8c9e-06bca13e37e1\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") "
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165521 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock" (OuterVolumeSpecName: "var-lock") pod "cf6a8691-f048-4173-8c9e-06bca13e37e1" (UID: "cf6a8691-f048-4173-8c9e-06bca13e37e1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165630 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cf6a8691-f048-4173-8c9e-06bca13e37e1" (UID: "cf6a8691-f048-4173-8c9e-06bca13e37e1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165998 4885 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock\") on node \"crc\" DevicePath \"\""
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.166020 4885 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.175398 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cf6a8691-f048-4173-8c9e-06bca13e37e1" (UID: "cf6a8691-f048-4173-8c9e-06bca13e37e1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.267590 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.352147 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.352328 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:39:15.352285225 +0000 UTC m=+456.748339288 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.352360 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.352839 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 08 19:37:13 crc kubenswrapper[4885]: W0308 19:37:13.353434 4885 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.353538 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.353550 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError"
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.353628 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:39:15.353605841 +0000 UTC m=+456.749659904 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.405972 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.424961 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.441530 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.454075 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: failed to sync secret cache: timed out waiting for the condition
Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.454171 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:39:15.454146686 +0000 UTC m=+456.850200719 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : failed to sync secret cache: timed out waiting for the condition
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.620140 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.621779 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad" exitCode=0
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.621869 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25fd057e84a174b7a296049d5b9ff10b2152fa4531833fdfcd0463f9afce0854"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.623862 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cf6a8691-f048-4173-8c9e-06bca13e37e1","Type":"ContainerDied","Data":"d0cc1d7c926212abd6b6240d8d958b1f68d5c10c839d13411e786e80784f25ac"}
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.623896 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0cc1d7c926212abd6b6240d8d958b1f68d5c10c839d13411e786e80784f25ac"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.623990 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.685513 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.688766 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.690501 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.691445 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.692045 4885 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused"
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.880857 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.881096 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.881194 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.881519 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.881557 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.881554 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.882268 4885 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.882380 4885 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.882401 4885 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.352484 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.352530 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition
Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.352620 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:39:16.35259092 +0000 UTC m=+457.748644973 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition
Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.353125 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.353166 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition
Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.353326 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:39:16.353261408 +0000 UTC m=+457.749315461 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.387277 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:37:14 crc kubenswrapper[4885]: W0308 19:37:14.387281 4885 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.387417 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:14 crc kubenswrapper[4885]: I0308 19:37:14.630492 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:14 crc kubenswrapper[4885]: I0308 19:37:14.658492 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:14 crc kubenswrapper[4885]: I0308 19:37:14.659002 4885 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:15 crc kubenswrapper[4885]: W0308 19:37:15.205810 4885 reflector.go:561] object-"openshift-multus"/"metrics-daemon-secret": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:15 crc kubenswrapper[4885]: E0308 19:37:15.206001 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"metrics-daemon-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:15 crc kubenswrapper[4885]: W0308 19:37:15.332013 4885 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:15 crc kubenswrapper[4885]: E0308 19:37:15.332161 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:15 crc kubenswrapper[4885]: W0308 19:37:15.371579 4885 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:15 crc kubenswrapper[4885]: E0308 19:37:15.371709 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:15 crc kubenswrapper[4885]: I0308 19:37:15.380570 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 08 19:37:15 crc kubenswrapper[4885]: W0308 19:37:15.463416 4885 reflector.go:561] 
object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:15 crc kubenswrapper[4885]: E0308 19:37:15.463568 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.334480 4885 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:16 crc kubenswrapper[4885]: I0308 19:37:16.335434 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.381664 4885 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.80:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189af4ddb4a2bba1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:37:16.379224993 +0000 UTC m=+337.775279056,LastTimestamp:2026-03-08 19:37:16.379224993 +0000 UTC m=+337.775279056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:37:16 crc kubenswrapper[4885]: I0308 19:37:16.648147 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b1861bbe578eb14c832d685adadee713d3dcff480bb29f882dc4c33b60aa14a2"} Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.777852 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.778843 4885 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.779797 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.781081 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.781854 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: I0308 19:37:16.781948 4885 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.782394 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="200ms" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.831397 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:37:16Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:37:16Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:37:16Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:37:16Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.831853 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.832430 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.832688 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 
19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.832890 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.832932 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.983114 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="400ms" Mar 08 19:37:17 crc kubenswrapper[4885]: E0308 19:37:17.384782 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="800ms" Mar 08 19:37:17 crc kubenswrapper[4885]: I0308 19:37:17.658500 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bc127ac2c1aecb1907d28d88aaeef9c25966c90abdc300af9cb39fb411002f31"} Mar 08 19:37:17 crc kubenswrapper[4885]: I0308 19:37:17.659212 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:17 crc kubenswrapper[4885]: E0308 19:37:17.659337 4885 kubelet.go:1929] "Failed creating a mirror pod for" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:18 crc kubenswrapper[4885]: E0308 19:37:18.186375 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="1.6s" Mar 08 19:37:18 crc kubenswrapper[4885]: W0308 19:37:18.558823 4885 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:18 crc kubenswrapper[4885]: E0308 19:37:18.559015 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:18 crc kubenswrapper[4885]: E0308 19:37:18.666846 4885 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:19 crc kubenswrapper[4885]: W0308 19:37:19.143141 4885 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:19 crc kubenswrapper[4885]: E0308 19:37:19.143586 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:19 crc kubenswrapper[4885]: I0308 19:37:19.372292 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:19 crc kubenswrapper[4885]: E0308 19:37:19.787535 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="3.2s" Mar 08 19:37:20 crc kubenswrapper[4885]: W0308 19:37:20.881358 4885 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:20 crc kubenswrapper[4885]: E0308 19:37:20.881494 4885 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:21 crc kubenswrapper[4885]: W0308 19:37:21.314048 4885 reflector.go:561] object-"openshift-multus"/"metrics-daemon-secret": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:21 crc kubenswrapper[4885]: E0308 19:37:21.314521 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"metrics-daemon-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:21 crc kubenswrapper[4885]: W0308 19:37:21.607718 4885 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:21 crc kubenswrapper[4885]: E0308 19:37:21.607831 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:22 crc kubenswrapper[4885]: E0308 19:37:22.724711 4885 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.80:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189af4ddb4a2bba1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:37:16.379224993 +0000 UTC m=+337.775279056,LastTimestamp:2026-03-08 19:37:16.379224993 +0000 UTC m=+337.775279056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:37:22 crc kubenswrapper[4885]: E0308 19:37:22.988627 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="6.4s" Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.367412 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.719825 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.719952 4885 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52" exitCode=1 Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.720009 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52"} Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.720802 4885 scope.go:117] "RemoveContainer" containerID="544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52" Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.722166 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.723901 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.367595 4885 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.367595 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.367767 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.369297 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.369909 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.405337 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.405390 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:25 crc kubenswrapper[4885]: E0308 19:37:25.406048 4885 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: 
connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.406761 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:25 crc kubenswrapper[4885]: W0308 19:37:25.432162 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-bb344e36d8c1ce8f197d35d375102988df813e29c12b45374f36e5542037f3b3 WatchSource:0}: Error finding container bb344e36d8c1ce8f197d35d375102988df813e29c12b45374f36e5542037f3b3: Status 404 returned error can't find the container with id bb344e36d8c1ce8f197d35d375102988df813e29c12b45374f36e5542037f3b3 Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.732508 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.732584 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6bc1d995ca761e047ab6d277b450a90b00ccbe1f56edb4d3d9e8120e33492bff"} Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.733660 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.734474 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.737177 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4ed7a863263333cde001b27bd1b307ff4d208443447ecd96c381d7415ed366e3"} Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.737220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bb344e36d8c1ce8f197d35d375102988df813e29c12b45374f36e5542037f3b3"} Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.737437 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.737452 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:25 crc kubenswrapper[4885]: E0308 19:37:25.737797 4885 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.738090 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc 
kubenswrapper[4885]: I0308 19:37:25.738445 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:26 crc kubenswrapper[4885]: I0308 19:37:26.745603 4885 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="4ed7a863263333cde001b27bd1b307ff4d208443447ecd96c381d7415ed366e3" exitCode=0 Mar 08 19:37:26 crc kubenswrapper[4885]: I0308 19:37:26.745704 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"4ed7a863263333cde001b27bd1b307ff4d208443447ecd96c381d7415ed366e3"} Mar 08 19:37:26 crc kubenswrapper[4885]: I0308 19:37:26.745972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"55d4f0f71d83df0058dd5dde5c1d8869f5759f2716fa35e41ec4fea667417d68"} Mar 08 19:37:26 crc kubenswrapper[4885]: I0308 19:37:26.745993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f0731730456eb934faecc5c035c665ceda5f622a624ed0338e4ef304852a0178"} Mar 08 19:37:26 crc kubenswrapper[4885]: I0308 19:37:26.746005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6d9577bfc5fa159c908c963144c9b39c1a869e64a4b45f5b11552dbac3be9ab8"} Mar 08 19:37:27 crc kubenswrapper[4885]: I0308 19:37:27.754318 4885 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d6a9df53b2f9030019552a672fc0792ac6aca1b82d9b38caa44c6622c202930b"} Mar 08 19:37:27 crc kubenswrapper[4885]: I0308 19:37:27.754567 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:27 crc kubenswrapper[4885]: I0308 19:37:27.754597 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:27 crc kubenswrapper[4885]: I0308 19:37:27.754577 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77cb8ad5c208d8a7d84e9642a997abac9e969ab2d1c37dc309ff6c8382e0ec83"} Mar 08 19:37:27 crc kubenswrapper[4885]: I0308 19:37:27.754686 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:28 crc kubenswrapper[4885]: I0308 19:37:28.367431 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.407968 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.408315 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.417063 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.471556 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.591893 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.606406 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 08 19:37:31 crc kubenswrapper[4885]: I0308 19:37:31.057963 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 19:37:32 crc kubenswrapper[4885]: I0308 19:37:32.768447 4885 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:32 crc kubenswrapper[4885]: I0308 19:37:32.862786 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f1e0cdcd-49a7-4ec6-a2dd-03087d3bd553" Mar 08 19:37:32 crc kubenswrapper[4885]: I0308 19:37:32.913874 4885 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.386700 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.802782 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.802821 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.807274 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f1e0cdcd-49a7-4ec6-a2dd-03087d3bd553" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.809182 4885 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://6d9577bfc5fa159c908c963144c9b39c1a869e64a4b45f5b11552dbac3be9ab8" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.809212 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.667619 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.667955 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" 
start-of-body= Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.667997 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.807616 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.807640 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.814694 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f1e0cdcd-49a7-4ec6-a2dd-03087d3bd553" Mar 08 19:37:43 crc kubenswrapper[4885]: I0308 19:37:43.153881 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 08 19:37:43 crc kubenswrapper[4885]: I0308 19:37:43.260407 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 19:37:43 crc kubenswrapper[4885]: I0308 19:37:43.757249 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 19:37:43 crc kubenswrapper[4885]: I0308 19:37:43.789340 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.153544 4885 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.161593 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.484659 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.641435 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.668117 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.668220 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.832810 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.862544 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.919015 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 
19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.991570 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.116627 4885 scope.go:117] "RemoveContainer" containerID="664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.140353 4885 scope.go:117] "RemoveContainer" containerID="9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.142289 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.159874 4885 scope.go:117] "RemoveContainer" containerID="76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.170048 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.192407 4885 scope.go:117] "RemoveContainer" containerID="f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.221242 4885 scope.go:117] "RemoveContainer" containerID="89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.292937 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.311800 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.528456 4885 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.622278 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.685109 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.701573 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.725951 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.815635 4885 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.968532 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.027391 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.110809 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.331420 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.405334 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 19:37:46 crc 
kubenswrapper[4885]: I0308 19:37:46.466876 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.479637 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.480472 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.512537 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.570591 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.670272 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.725230 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.791385 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.898214 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.898299 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.910865 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" 
Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.947304 4885 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.961668 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.962025 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.011870 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.020177 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.103381 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.178024 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.287477 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.306742 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.324822 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.372843 4885 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.448164 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.476006 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.527873 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.578783 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.715630 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.750672 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.764280 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.770798 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.800034 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.810332 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.821586 4885 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.826061 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.844312 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.984134 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.029173 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.035855 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.167333 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.239842 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.251729 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.252858 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.529503 4885 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.534854 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.536578 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.556464 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.706911 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.989663 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.997395 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.022998 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.135794 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.143051 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.252016 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.278034 
4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.327389 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.348306 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.353214 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.373091 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.406399 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.457141 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.548354 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.578068 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.593534 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.640202 4885 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.652975 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.705407 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.780468 4885 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.789841 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.816036 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.909968 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.913464 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.976128 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.979371 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.982413 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.063161 4885 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.152275 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.256248 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.265751 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.374364 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.418630 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.471590 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.476893 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.488061 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.504690 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.506479 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 
19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.636991 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.715520 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.814886 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.818626 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.855967 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.883402 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.954517 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.060783 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.065607 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.163178 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.228465 4885 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.254064 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.260000 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.466127 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.478161 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.527527 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.540510 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.549338 4885 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.620491 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.645052 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.663551 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.707220 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.749081 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.846750 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.850595 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.877161 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.888070 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.919164 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.940779 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.965626 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.020290 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.064984 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.090273 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.116535 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.124987 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.146214 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.166456 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.211475 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.280702 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.311281 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.311699 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.449075 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.497794 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.615167 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.617570 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.636380 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.841706 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.904579 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.958822 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.970075 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.081519 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.092894 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.221637 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.232447 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.284078 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.380506 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.413439 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.428434 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.497820 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.553560 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.581238 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.630479 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.769597 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.825704 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.874199 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.894121 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.906951 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.087470 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.159411 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.197052 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.372008 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.406221 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.572819 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.627378 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.674134 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.683204 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.711684 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.730387 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.732637 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.847648 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.889524 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.890826 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.950112 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.960758 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.989766 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.037870 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.045843 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.368858 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.458065 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.468005 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.579253 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.590395 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.594964 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.598476 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.745209 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.776462 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.794332 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.817913 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.840983 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.895352 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.018852 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.058952 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.083408 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.171998 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.251696 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.398851 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.427319 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.561624 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.723404 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.757345 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.807369 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.834966 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.902132 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.902159 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.980769 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.097559 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.250793 4885 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.258295 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.258371 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.272741 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.321185 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.326961 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.326942073 podStartE2EDuration="25.326942073s" podCreationTimestamp="2026-03-08 19:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:37:57.294277364 +0000 UTC m=+378.690331427" watchObservedRunningTime="2026-03-08 19:37:57.326942073 +0000 UTC m=+378.722996126"
Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.843113 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.886678 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.904775 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.905881 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.029127 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.420583 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.602208 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.643718 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.645765 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.714682 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.755769 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.912891 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 08 19:37:59 crc kubenswrapper[4885]: I0308 19:37:59.057029 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 08 19:37:59 crc kubenswrapper[4885]: I0308 19:37:59.103717 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 08 19:37:59 crc kubenswrapper[4885]: I0308 19:37:59.394044 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 08 19:37:59 crc kubenswrapper[4885]: I0308 19:37:59.530841 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 08 19:37:59 crc kubenswrapper[4885]: I0308 19:37:59.975656 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 08 19:38:01 crc kubenswrapper[4885]: I0308 19:38:01.098609 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.447879 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549978-9wjmp"]
Mar 08 19:38:03 crc kubenswrapper[4885]: E0308 19:38:03.448293 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" containerName="installer"
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.448303 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" containerName="installer"
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.448388 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" containerName="installer"
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.448716 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549978-9wjmp"
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.450120 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.450203 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.450256 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.455651 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549978-9wjmp"]
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.544391 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljx6n\" (UniqueName: \"kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n\") pod \"auto-csr-approver-29549978-9wjmp\" (UID: \"71a3a17a-ffa4-4d31-94fb-7e720297e94b\") " pod="openshift-infra/auto-csr-approver-29549978-9wjmp"
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.645501 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljx6n\" (UniqueName: \"kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n\") pod \"auto-csr-approver-29549978-9wjmp\" (UID: \"71a3a17a-ffa4-4d31-94fb-7e720297e94b\") " pod="openshift-infra/auto-csr-approver-29549978-9wjmp"
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.693680 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljx6n\" (UniqueName: \"kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n\") pod \"auto-csr-approver-29549978-9wjmp\" (UID: \"71a3a17a-ffa4-4d31-94fb-7e720297e94b\") " pod="openshift-infra/auto-csr-approver-29549978-9wjmp"
Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.761209 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549978-9wjmp"
Mar 08 19:38:04 crc kubenswrapper[4885]: I0308 19:38:04.193906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549978-9wjmp"]
Mar 08 19:38:04 crc kubenswrapper[4885]: W0308 19:38:04.199356 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71a3a17a_ffa4_4d31_94fb_7e720297e94b.slice/crio-39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716 WatchSource:0}: Error finding container 39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716: Status 404 returned error can't find the container with id 39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716
Mar 08 19:38:05 crc kubenswrapper[4885]: I0308 19:38:05.009887 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" event={"ID":"71a3a17a-ffa4-4d31-94fb-7e720297e94b","Type":"ContainerStarted","Data":"39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716"}
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.017621 4885 generic.go:334] "Generic (PLEG): container finished" podID="71a3a17a-ffa4-4d31-94fb-7e720297e94b" containerID="2aa80e241984cb33e73f4238b00c6079576ad160fd6e5654000fab91ecb22f03" exitCode=0
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.017684 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" event={"ID":"71a3a17a-ffa4-4d31-94fb-7e720297e94b","Type":"ContainerDied","Data":"2aa80e241984cb33e73f4238b00c6079576ad160fd6e5654000fab91ecb22f03"}
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.234829 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xpctw"]
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.235267 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xpctw" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="registry-server" containerID="cri-o://b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a" gracePeriod=30
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.252533 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gnjnd"]
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.252886 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gnjnd" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="registry-server" containerID="cri-o://246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a" gracePeriod=30
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.265889 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"]
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.266312 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" containerID="cri-o://598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01" gracePeriod=30
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.271710 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62xgk"]
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.272080 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-62xgk" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="registry-server" containerID="cri-o://2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" gracePeriod=30
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.287521 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"]
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.287826 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pqxt7" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="registry-server" containerID="cri-o://ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c" gracePeriod=30
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.291294 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"]
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.291553 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-prdq9" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="registry-server" containerID="cri-o://cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80" gracePeriod=30
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.300094 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2774l"]
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.300836 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2774l"
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.314676 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2774l"]
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.378022 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb9h9\" (UniqueName: \"kubernetes.io/projected/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-kube-api-access-rb9h9\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l"
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.378117 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l"
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.378189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l"
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.479802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb9h9\" (UniqueName: \"kubernetes.io/projected/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-kube-api-access-rb9h9\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l"
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.479871 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l"
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.479894 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l"
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.481160 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l"
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.485131 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l"
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.499083 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rb9h9\" (UniqueName: \"kubernetes.io/projected/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-kube-api-access-rb9h9\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.773418 4885 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.773718 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://bc127ac2c1aecb1907d28d88aaeef9c25966c90abdc300af9cb39fb411002f31" gracePeriod=5 Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.790172 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.794088 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:38:06 crc kubenswrapper[4885]: E0308 19:38:06.829425 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 is running failed: container process not found" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 19:38:06 crc kubenswrapper[4885]: E0308 19:38:06.829865 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 is running failed: container process not found" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 19:38:06 crc kubenswrapper[4885]: E0308 19:38:06.830600 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 is running failed: container process not found" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 19:38:06 crc kubenswrapper[4885]: E0308 19:38:06.830641 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-62xgk" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="registry-server" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.876499 4885 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.881235 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.885639 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics\") pod \"83de4c2d-767a-4635-8748-486dd45683a1\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.885712 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqf5f\" (UniqueName: \"kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f\") pod \"83de4c2d-767a-4635-8748-486dd45683a1\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.885794 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca\") pod \"83de4c2d-767a-4635-8748-486dd45683a1\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.886682 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "83de4c2d-767a-4635-8748-486dd45683a1" (UID: "83de4c2d-767a-4635-8748-486dd45683a1"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.891356 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.910104 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f" (OuterVolumeSpecName: "kube-api-access-pqf5f") pod "83de4c2d-767a-4635-8748-486dd45683a1" (UID: "83de4c2d-767a-4635-8748-486dd45683a1"). InnerVolumeSpecName "kube-api-access-pqf5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.910693 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "83de4c2d-767a-4635-8748-486dd45683a1" (UID: "83de4c2d-767a-4635-8748-486dd45683a1"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.919759 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.922645 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989121 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities\") pod \"2a6b85b3-0bb1-4199-983f-615a6c932f09\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989174 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content\") pod \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989201 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs66k\" (UniqueName: \"kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k\") pod \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st8f9\" (UniqueName: \"kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9\") pod \"05666e0b-c4ce-451a-ba67-ddb78866ef54\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989245 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities\") pod \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989282 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content\") pod \"56c146b0-3448-4140-8cf0-8d637f7f22a9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989301 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content\") pod \"8881ba5e-d9d1-42a9-98af-849e72053757\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989370 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content\") pod \"05666e0b-c4ce-451a-ba67-ddb78866ef54\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989386 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities\") pod \"8881ba5e-d9d1-42a9-98af-849e72053757\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989884 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content\") pod \"2a6b85b3-0bb1-4199-983f-615a6c932f09\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989906 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5wpq\" (UniqueName: \"kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq\") pod \"56c146b0-3448-4140-8cf0-8d637f7f22a9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " 
Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989940 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x76vh\" (UniqueName: \"kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh\") pod \"8881ba5e-d9d1-42a9-98af-849e72053757\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989958 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities\") pod \"56c146b0-3448-4140-8cf0-8d637f7f22a9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989976 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cp8v\" (UniqueName: \"kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v\") pod \"2a6b85b3-0bb1-4199-983f-615a6c932f09\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990000 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities\") pod \"05666e0b-c4ce-451a-ba67-ddb78866ef54\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990235 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990247 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics\") on node \"crc\" DevicePath 
\"\"" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990233 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities" (OuterVolumeSpecName: "utilities") pod "7d8fbc68-3714-4fe4-9f62-857c5dc05661" (UID: "7d8fbc68-3714-4fe4-9f62-857c5dc05661"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990257 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqf5f\" (UniqueName: \"kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990905 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities" (OuterVolumeSpecName: "utilities") pod "05666e0b-c4ce-451a-ba67-ddb78866ef54" (UID: "05666e0b-c4ce-451a-ba67-ddb78866ef54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.991697 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities" (OuterVolumeSpecName: "utilities") pod "56c146b0-3448-4140-8cf0-8d637f7f22a9" (UID: "56c146b0-3448-4140-8cf0-8d637f7f22a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.992338 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k" (OuterVolumeSpecName: "kube-api-access-gs66k") pod "7d8fbc68-3714-4fe4-9f62-857c5dc05661" (UID: "7d8fbc68-3714-4fe4-9f62-857c5dc05661"). InnerVolumeSpecName "kube-api-access-gs66k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.993488 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities" (OuterVolumeSpecName: "utilities") pod "8881ba5e-d9d1-42a9-98af-849e72053757" (UID: "8881ba5e-d9d1-42a9-98af-849e72053757"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.995120 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v" (OuterVolumeSpecName: "kube-api-access-8cp8v") pod "2a6b85b3-0bb1-4199-983f-615a6c932f09" (UID: "2a6b85b3-0bb1-4199-983f-615a6c932f09"). InnerVolumeSpecName "kube-api-access-8cp8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.005360 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9" (OuterVolumeSpecName: "kube-api-access-st8f9") pod "05666e0b-c4ce-451a-ba67-ddb78866ef54" (UID: "05666e0b-c4ce-451a-ba67-ddb78866ef54"). InnerVolumeSpecName "kube-api-access-st8f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.005441 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq" (OuterVolumeSpecName: "kube-api-access-j5wpq") pod "56c146b0-3448-4140-8cf0-8d637f7f22a9" (UID: "56c146b0-3448-4140-8cf0-8d637f7f22a9"). InnerVolumeSpecName "kube-api-access-j5wpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.012334 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities" (OuterVolumeSpecName: "utilities") pod "2a6b85b3-0bb1-4199-983f-615a6c932f09" (UID: "2a6b85b3-0bb1-4199-983f-615a6c932f09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.017025 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh" (OuterVolumeSpecName: "kube-api-access-x76vh") pod "8881ba5e-d9d1-42a9-98af-849e72053757" (UID: "8881ba5e-d9d1-42a9-98af-849e72053757"). InnerVolumeSpecName "kube-api-access-x76vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.026883 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerID="246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.026998 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.027118 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerDied","Data":"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.027160 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerDied","Data":"de0e60604d3aa86bafd041642af24e9211dfd9322182b13ae9a6b56c608e4e2c"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.027177 4885 scope.go:117] "RemoveContainer" containerID="246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.032092 4885 generic.go:334] "Generic (PLEG): container finished" podID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerID="b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.032163 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.032177 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerDied","Data":"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.032225 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerDied","Data":"9bf4496f530b593c8f965a319860f693e2e64a4e57a6c4d734640ea6410547bb"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.036259 4885 generic.go:334] "Generic (PLEG): container finished" podID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerID="cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.036330 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.036941 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerDied","Data":"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.036974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerDied","Data":"2d0ba41af79384822819519e78d3dc7f4370f70eb9f77bb319b3453eb2e2a641"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.041551 4885 generic.go:334] "Generic (PLEG): container finished" podID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.041947 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerDied","Data":"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.042036 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerDied","Data":"731327ad0ac3fd48c5dcf825c4aabc506f0114149e811eabdfb465d917e7e122"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.042895 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.045724 4885 generic.go:334] "Generic (PLEG): container finished" podID="83de4c2d-767a-4635-8748-486dd45683a1" containerID="598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.045869 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" event={"ID":"83de4c2d-767a-4635-8748-486dd45683a1","Type":"ContainerDied","Data":"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.046229 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" event={"ID":"83de4c2d-767a-4635-8748-486dd45683a1","Type":"ContainerDied","Data":"2ea70402b3dbdea12ac7aa07af023bd9134877c1cdbc64f413fbc103681b29c0"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.047208 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.049082 4885 generic.go:334] "Generic (PLEG): container finished" podID="8881ba5e-d9d1-42a9-98af-849e72053757" containerID="ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.049242 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.049483 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerDied","Data":"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.049510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerDied","Data":"789329547b46208ee5dc38fb335a56f42b713329e6d11c7a30e5d4042c3f9ea3"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.054285 4885 scope.go:117] "RemoveContainer" containerID="fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.055685 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a6b85b3-0bb1-4199-983f-615a6c932f09" (UID: "2a6b85b3-0bb1-4199-983f-615a6c932f09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.067217 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05666e0b-c4ce-451a-ba67-ddb78866ef54" (UID: "05666e0b-c4ce-451a-ba67-ddb78866ef54"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092604 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092633 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092643 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092652 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5wpq\" (UniqueName: \"kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092663 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x76vh\" (UniqueName: \"kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092673 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092682 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cp8v\" (UniqueName: \"kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: 
I0308 19:38:07.092690 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092698 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092706 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs66k\" (UniqueName: \"kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092715 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st8f9\" (UniqueName: \"kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092724 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.098305 4885 scope.go:117] "RemoveContainer" containerID="c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.115630 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.123828 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.128284 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d8fbc68-3714-4fe4-9f62-857c5dc05661" (UID: "7d8fbc68-3714-4fe4-9f62-857c5dc05661"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.154186 4885 scope.go:117] "RemoveContainer" containerID="246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.157269 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a\": container with ID starting with 246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a not found: ID does not exist" containerID="246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.157309 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a"} err="failed to get container status \"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a\": rpc error: code = NotFound desc = could not find container \"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a\": container with ID starting with 246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.157334 4885 scope.go:117] "RemoveContainer" containerID="fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.158722 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b\": container with ID 
starting with fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b not found: ID does not exist" containerID="fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.158768 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b"} err="failed to get container status \"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b\": rpc error: code = NotFound desc = could not find container \"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b\": container with ID starting with fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.158795 4885 scope.go:117] "RemoveContainer" containerID="c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.159404 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f\": container with ID starting with c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f not found: ID does not exist" containerID="c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.159423 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f"} err="failed to get container status \"c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f\": rpc error: code = NotFound desc = could not find container \"c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f\": container with ID starting with c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f not found: 
ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.159435 4885 scope.go:117] "RemoveContainer" containerID="b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.173689 4885 scope.go:117] "RemoveContainer" containerID="06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.194090 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.209990 4885 scope.go:117] "RemoveContainer" containerID="87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.215224 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8881ba5e-d9d1-42a9-98af-849e72053757" (UID: "8881ba5e-d9d1-42a9-98af-849e72053757"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.239447 4885 scope.go:117] "RemoveContainer" containerID="b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.240047 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a\": container with ID starting with b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a not found: ID does not exist" containerID="b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.240089 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a"} err="failed to get container status \"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a\": rpc error: code = NotFound desc = could not find container \"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a\": container with ID starting with b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.240142 4885 scope.go:117] "RemoveContainer" containerID="06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.240586 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313\": container with ID starting with 06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313 not found: ID does not exist" containerID="06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.240622 
4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313"} err="failed to get container status \"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313\": rpc error: code = NotFound desc = could not find container \"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313\": container with ID starting with 06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.240648 4885 scope.go:117] "RemoveContainer" containerID="87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.241202 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54\": container with ID starting with 87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54 not found: ID does not exist" containerID="87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.241268 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54"} err="failed to get container status \"87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54\": rpc error: code = NotFound desc = could not find container \"87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54\": container with ID starting with 87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.241297 4885 scope.go:117] "RemoveContainer" containerID="cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 
19:38:07.259123 4885 scope.go:117] "RemoveContainer" containerID="aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.274066 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56c146b0-3448-4140-8cf0-8d637f7f22a9" (UID: "56c146b0-3448-4140-8cf0-8d637f7f22a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.278964 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.280736 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2774l"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.281270 4885 scope.go:117] "RemoveContainer" containerID="6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.295076 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.295100 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.305275 4885 scope.go:117] "RemoveContainer" containerID="cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.305853 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80\": container with ID starting with cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80 not found: ID does not exist" containerID="cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.305910 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80"} err="failed to get container status \"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80\": rpc error: code = NotFound desc = could not find container \"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80\": container with ID starting with cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.305955 4885 scope.go:117] "RemoveContainer" containerID="aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.306428 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908\": container with ID starting with aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908 not found: ID does not exist" containerID="aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.306492 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908"} err="failed to get container status \"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908\": rpc error: code = NotFound desc = could not find container 
\"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908\": container with ID starting with aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.306507 4885 scope.go:117] "RemoveContainer" containerID="6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.307028 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076\": container with ID starting with 6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076 not found: ID does not exist" containerID="6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.307050 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076"} err="failed to get container status \"6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076\": rpc error: code = NotFound desc = could not find container \"6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076\": container with ID starting with 6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.307063 4885 scope.go:117] "RemoveContainer" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.319154 4885 scope.go:117] "RemoveContainer" containerID="67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.338644 4885 scope.go:117] "RemoveContainer" containerID="b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd" Mar 08 19:38:07 crc 
kubenswrapper[4885]: I0308 19:38:07.360452 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gnjnd"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.362956 4885 scope.go:117] "RemoveContainer" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.363005 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gnjnd"] Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.363406 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876\": container with ID starting with 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 not found: ID does not exist" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.363441 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876"} err="failed to get container status \"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876\": rpc error: code = NotFound desc = could not find container \"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876\": container with ID starting with 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.363467 4885 scope.go:117] "RemoveContainer" containerID="67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.363966 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de\": container with ID starting with 67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de not found: ID does not exist" containerID="67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.364012 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de"} err="failed to get container status \"67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de\": rpc error: code = NotFound desc = could not find container \"67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de\": container with ID starting with 67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.364039 4885 scope.go:117] "RemoveContainer" containerID="b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.366784 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd\": container with ID starting with b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd not found: ID does not exist" containerID="b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.366821 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd"} err="failed to get container status \"b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd\": rpc error: code = NotFound desc = could not find container \"b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd\": container with ID 
starting with b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.366849 4885 scope.go:117] "RemoveContainer" containerID="598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.381646 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" path="/var/lib/kubelet/pods/2a6b85b3-0bb1-4199-983f-615a6c932f09/volumes" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.383630 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83de4c2d-767a-4635-8748-486dd45683a1" path="/var/lib/kubelet/pods/83de4c2d-767a-4635-8748-486dd45683a1/volumes" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.396627 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljx6n\" (UniqueName: \"kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n\") pod \"71a3a17a-ffa4-4d31-94fb-7e720297e94b\" (UID: \"71a3a17a-ffa4-4d31-94fb-7e720297e94b\") " Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.399590 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n" (OuterVolumeSpecName: "kube-api-access-ljx6n") pod "71a3a17a-ffa4-4d31-94fb-7e720297e94b" (UID: "71a3a17a-ffa4-4d31-94fb-7e720297e94b"). InnerVolumeSpecName "kube-api-access-ljx6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.439474 4885 scope.go:117] "RemoveContainer" containerID="598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.440652 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01\": container with ID starting with 598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01 not found: ID does not exist" containerID="598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.440699 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01"} err="failed to get container status \"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01\": rpc error: code = NotFound desc = could not find container \"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01\": container with ID starting with 598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.440736 4885 scope.go:117] "RemoveContainer" containerID="ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.452758 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.459501 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.477225 4885 scope.go:117] "RemoveContainer" containerID="2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7" Mar 08 
19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.488684 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62xgk"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.496125 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-62xgk"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.499206 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljx6n\" (UniqueName: \"kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.503384 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.512278 4885 scope.go:117] "RemoveContainer" containerID="3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.518391 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.519986 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xpctw"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.522548 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xpctw"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.531985 4885 scope.go:117] "RemoveContainer" containerID="ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.532376 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c\": container with ID starting with 
ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c not found: ID does not exist" containerID="ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.532406 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c"} err="failed to get container status \"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c\": rpc error: code = NotFound desc = could not find container \"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c\": container with ID starting with ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.532427 4885 scope.go:117] "RemoveContainer" containerID="2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.532691 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7\": container with ID starting with 2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7 not found: ID does not exist" containerID="2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.532711 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7"} err="failed to get container status \"2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7\": rpc error: code = NotFound desc = could not find container \"2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7\": container with ID starting with 2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7 not found: ID does not 
exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.532724 4885 scope.go:117] "RemoveContainer" containerID="3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.533043 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2\": container with ID starting with 3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2 not found: ID does not exist" containerID="3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.533071 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2"} err="failed to get container status \"3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2\": rpc error: code = NotFound desc = could not find container \"3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2\": container with ID starting with 3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2 not found: ID does not exist" Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.054737 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" event={"ID":"71a3a17a-ffa4-4d31-94fb-7e720297e94b","Type":"ContainerDied","Data":"39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716"} Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.055002 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716" Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.054895 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.061088 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" event={"ID":"1e87323f-cf50-46ef-8e7c-cccd8a1e3601","Type":"ContainerStarted","Data":"0ccaa614eed33e87604a5ab4986e447e3b6f5d78d3eb03e7441a690f11f6b9be"} Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.061216 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.061452 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" event={"ID":"1e87323f-cf50-46ef-8e7c-cccd8a1e3601","Type":"ContainerStarted","Data":"95eb3b5189eee966a76657d1747943f212d25aa3f031105fcd582f411077fab9"} Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.064965 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.092273 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" podStartSLOduration=2.092255352 podStartE2EDuration="2.092255352s" podCreationTimestamp="2026-03-08 19:38:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:38:08.089541049 +0000 UTC m=+389.485595072" watchObservedRunningTime="2026-03-08 19:38:08.092255352 +0000 UTC m=+389.488309375" Mar 08 19:38:09 crc kubenswrapper[4885]: I0308 19:38:09.383479 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" path="/var/lib/kubelet/pods/05666e0b-c4ce-451a-ba67-ddb78866ef54/volumes" Mar 08 
19:38:09 crc kubenswrapper[4885]: I0308 19:38:09.384714 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" path="/var/lib/kubelet/pods/56c146b0-3448-4140-8cf0-8d637f7f22a9/volumes" Mar 08 19:38:09 crc kubenswrapper[4885]: I0308 19:38:09.386544 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" path="/var/lib/kubelet/pods/7d8fbc68-3714-4fe4-9f62-857c5dc05661/volumes" Mar 08 19:38:09 crc kubenswrapper[4885]: I0308 19:38:09.389178 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" path="/var/lib/kubelet/pods/8881ba5e-d9d1-42a9-98af-849e72053757/volumes" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.083601 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.083847 4885 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="bc127ac2c1aecb1907d28d88aaeef9c25966c90abdc300af9cb39fb411002f31" exitCode=137 Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.369720 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.369802 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.462840 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.462963 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463019 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463063 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463083 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463094 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463159 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463308 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463259 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.464209 4885 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.464247 4885 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.464265 4885 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.464285 4885 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.474841 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.565848 4885 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:13 crc kubenswrapper[4885]: I0308 19:38:13.092029 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 08 19:38:13 crc kubenswrapper[4885]: I0308 19:38:13.092432 4885 scope.go:117] "RemoveContainer" containerID="bc127ac2c1aecb1907d28d88aaeef9c25966c90abdc300af9cb39fb411002f31" Mar 08 19:38:13 crc kubenswrapper[4885]: I0308 19:38:13.092522 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:38:13 crc kubenswrapper[4885]: I0308 19:38:13.377595 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 08 19:39:02 crc kubenswrapper[4885]: I0308 19:39:02.818266 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:39:02 crc kubenswrapper[4885]: I0308 19:39:02.818998 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 
19:39:15.365677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.366578 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.368008 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.375434 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.468307 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" 
(UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.468538 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.474132 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.772073 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.780606 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.024132 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jps4r"] Mar 08 19:39:16 crc kubenswrapper[4885]: W0308 19:39:16.047388 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f639c4e_64b8_45e9_bf33_c1d8c376b438.slice/crio-9d9e55809354362080954d8039bfd7b4394e545b8a44bc907bf71e30d4ab7288 WatchSource:0}: Error finding container 9d9e55809354362080954d8039bfd7b4394e545b8a44bc907bf71e30d4ab7288: Status 404 returned error can't find the container with id 9d9e55809354362080954d8039bfd7b4394e545b8a44bc907bf71e30d4ab7288 Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.381755 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.382131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.388941 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.389431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.496772 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jps4r" event={"ID":"2f639c4e-64b8-45e9-bf33-c1d8c376b438","Type":"ContainerStarted","Data":"972832e9972527dceaf81093e0a909c97ff2667d0ccbc2bc33966538d084245b"} Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.497020 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-jps4r" event={"ID":"2f639c4e-64b8-45e9-bf33-c1d8c376b438","Type":"ContainerStarted","Data":"9d9e55809354362080954d8039bfd7b4394e545b8a44bc907bf71e30d4ab7288"} Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.497884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"47af6b77aa1702865b3d217075ca59ab36c243e6bb360d65e62c73b372897c70"} Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.497958 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fb78f201747e8b6b0fbfb3771649a8e4fa3501fa682b9d3978812012cc07a527"} Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.569237 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.683267 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:39:16 crc kubenswrapper[4885]: W0308 19:39:16.915123 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-efca47b0d41da71b96ce3404845981447176613dea11a0dfe26aaff2ff39269f WatchSource:0}: Error finding container efca47b0d41da71b96ce3404845981447176613dea11a0dfe26aaff2ff39269f: Status 404 returned error can't find the container with id efca47b0d41da71b96ce3404845981447176613dea11a0dfe26aaff2ff39269f Mar 08 19:39:17 crc kubenswrapper[4885]: I0308 19:39:17.513111 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a005623d36a21c04f7a754369f96097597f0f99d9b302fa7f7b7061a8b391370"} Mar 08 19:39:17 crc kubenswrapper[4885]: I0308 19:39:17.513179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d61969643d68289f2dccff9b4bccdb6b82ba4fee905f240c6b6642ddcda13790"} Mar 08 19:39:17 crc kubenswrapper[4885]: I0308 19:39:17.515840 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jps4r" event={"ID":"2f639c4e-64b8-45e9-bf33-c1d8c376b438","Type":"ContainerStarted","Data":"dff68ddffa61ea6d8e83c3a07abf0ab18086e0f31a56d7e2b7b734bf137f28c6"} Mar 08 19:39:17 crc kubenswrapper[4885]: I0308 19:39:17.519210 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"744b4698473108a806e0119f4d388c2ce6e22a844ade5daf02a3040ee6078138"} Mar 08 19:39:17 crc 
kubenswrapper[4885]: I0308 19:39:17.519281 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"efca47b0d41da71b96ce3404845981447176613dea11a0dfe26aaff2ff39269f"} Mar 08 19:39:17 crc kubenswrapper[4885]: I0308 19:39:17.519509 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.742894 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jps4r" podStartSLOduration=434.742863298 podStartE2EDuration="7m14.742863298s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:39:17.606521587 +0000 UTC m=+459.002575650" watchObservedRunningTime="2026-03-08 19:39:29.742863298 +0000 UTC m=+471.138917361" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.751592 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7btr2"] Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752004 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752049 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752082 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752102 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752132 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752149 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752177 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752197 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752224 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752242 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752265 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752282 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752297 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752310 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752326 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752336 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752350 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752361 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752374 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752385 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752402 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752413 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752448 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752459 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752477 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752487 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752504 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752514 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752527 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752537 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752548 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752558 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752572 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a3a17a-ffa4-4d31-94fb-7e720297e94b" containerName="oc" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752581 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="71a3a17a-ffa4-4d31-94fb-7e720297e94b" containerName="oc" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752593 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752602 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752734 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752748 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752763 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752775 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a3a17a-ffa4-4d31-94fb-7e720297e94b" containerName="oc" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752791 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752809 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752821 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752837 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.753473 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.770565 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7btr2"] Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867798 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cf151e9-2721-48a3-825e-c74a1caa0a76-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqc7z\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-kube-api-access-tqc7z\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867870 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-trusted-ca\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867904 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867946 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-certificates\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867984 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cf151e9-2721-48a3-825e-c74a1caa0a76-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.868009 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-bound-sa-token\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.868023 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-tls\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 
08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.891798 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969078 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cf151e9-2721-48a3-825e-c74a1caa0a76-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969447 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-bound-sa-token\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-tls\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969542 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cf151e9-2721-48a3-825e-c74a1caa0a76-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969584 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqc7z\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-kube-api-access-tqc7z\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969621 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-trusted-ca\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969683 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-certificates\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.970392 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cf151e9-2721-48a3-825e-c74a1caa0a76-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.971507 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-trusted-ca\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.972153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-certificates\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.975685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-tls\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.976016 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cf151e9-2721-48a3-825e-c74a1caa0a76-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.983346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-bound-sa-token\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.988323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqc7z\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-kube-api-access-tqc7z\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:30 crc kubenswrapper[4885]: I0308 19:39:30.074853 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:30 crc kubenswrapper[4885]: I0308 19:39:30.597543 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7btr2"]
Mar 08 19:39:30 crc kubenswrapper[4885]: W0308 19:39:30.606356 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cf151e9_2721_48a3_825e_c74a1caa0a76.slice/crio-27a1367e28d4aba3733fe0fe4b54f6b2881715a0140b1fa7bee690265837269b WatchSource:0}: Error finding container 27a1367e28d4aba3733fe0fe4b54f6b2881715a0140b1fa7bee690265837269b: Status 404 returned error can't find the container with id 27a1367e28d4aba3733fe0fe4b54f6b2881715a0140b1fa7bee690265837269b
Mar 08 19:39:30 crc kubenswrapper[4885]: I0308 19:39:30.699616 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" event={"ID":"2cf151e9-2721-48a3-825e-c74a1caa0a76","Type":"ContainerStarted","Data":"27a1367e28d4aba3733fe0fe4b54f6b2881715a0140b1fa7bee690265837269b"}
Mar 08 19:39:31 crc kubenswrapper[4885]: I0308 19:39:31.714646 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" event={"ID":"2cf151e9-2721-48a3-825e-c74a1caa0a76","Type":"ContainerStarted","Data":"14d1f3da52a97070bff36e9722cd8d71ef20ee20c61a4b32cd602df90312ecfe"}
Mar 08 19:39:31 crc kubenswrapper[4885]: I0308 19:39:31.715176 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:31 crc kubenswrapper[4885]: I0308 19:39:31.753501 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" podStartSLOduration=2.753428614 podStartE2EDuration="2.753428614s" podCreationTimestamp="2026-03-08 19:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:39:31.750779623 +0000 UTC m=+473.146833676" watchObservedRunningTime="2026-03-08 19:39:31.753428614 +0000 UTC m=+473.149482667"
Mar 08 19:39:32 crc kubenswrapper[4885]: I0308 19:39:32.818068 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 19:39:32 crc kubenswrapper[4885]: I0308 19:39:32.818399 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.161145 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6ctxc"]
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.162109 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.163885 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.183631 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ctxc"]
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.216535 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-utilities\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.216758 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnw7v\" (UniqueName: \"kubernetes.io/projected/751589b1-c864-424f-9315-13a7d880bcf6-kube-api-access-vnw7v\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.216818 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-catalog-content\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.318645 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-utilities\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.318866 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnw7v\" (UniqueName: \"kubernetes.io/projected/751589b1-c864-424f-9315-13a7d880bcf6-kube-api-access-vnw7v\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.318956 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-catalog-content\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.319818 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-utilities\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.319864 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-catalog-content\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.346088 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnw7v\" (UniqueName: \"kubernetes.io/projected/751589b1-c864-424f-9315-13a7d880bcf6-kube-api-access-vnw7v\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.361428 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bs2qs"]
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.363443 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.365801 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.388818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs2qs"]
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.420021 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w28js\" (UniqueName: \"kubernetes.io/projected/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-kube-api-access-w28js\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.420137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-utilities\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.420206 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-catalog-content\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.491502 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.521168 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w28js\" (UniqueName: \"kubernetes.io/projected/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-kube-api-access-w28js\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.521313 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-utilities\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.521411 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-catalog-content\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.521971 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-catalog-content\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.522069 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-utilities\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.539739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w28js\" (UniqueName: \"kubernetes.io/projected/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-kube-api-access-w28js\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.688912 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.717985 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ctxc"]
Mar 08 19:39:33 crc kubenswrapper[4885]: W0308 19:39:33.726101 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod751589b1_c864_424f_9315_13a7d880bcf6.slice/crio-e4d9ea5fb345651a9abba046949c2263214c357d4835a6d860e4e3646f4268a4 WatchSource:0}: Error finding container e4d9ea5fb345651a9abba046949c2263214c357d4835a6d860e4e3646f4268a4: Status 404 returned error can't find the container with id e4d9ea5fb345651a9abba046949c2263214c357d4835a6d860e4e3646f4268a4
Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.938383 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs2qs"]
Mar 08 19:39:33 crc kubenswrapper[4885]: W0308 19:39:33.962286 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdc128cf_2f55_4964_8229_6aa7e1dd9f1e.slice/crio-8dcf908b7f40608c33a9c435ac396917389b126d38beefad066b46752dca657c WatchSource:0}: Error finding container 8dcf908b7f40608c33a9c435ac396917389b126d38beefad066b46752dca657c: Status 404 returned error can't find the container with id 8dcf908b7f40608c33a9c435ac396917389b126d38beefad066b46752dca657c
Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.738330 4885 generic.go:334] "Generic (PLEG): container finished" podID="bdc128cf-2f55-4964-8229-6aa7e1dd9f1e" containerID="92283bbdc51675a1f97ec35414854fa5ba662317b0bdeaeb0793f1df4f4bbea0" exitCode=0
Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.739272 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2qs" event={"ID":"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e","Type":"ContainerDied","Data":"92283bbdc51675a1f97ec35414854fa5ba662317b0bdeaeb0793f1df4f4bbea0"}
Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.739420 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2qs" event={"ID":"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e","Type":"ContainerStarted","Data":"8dcf908b7f40608c33a9c435ac396917389b126d38beefad066b46752dca657c"}
Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.745423 4885 generic.go:334] "Generic (PLEG): container finished" podID="751589b1-c864-424f-9315-13a7d880bcf6" containerID="c92e401a0ca9753fe02c9b9adf3141c04b04648a84aef52488db224f7d2d516c" exitCode=0
Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.745493 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ctxc" event={"ID":"751589b1-c864-424f-9315-13a7d880bcf6","Type":"ContainerDied","Data":"c92e401a0ca9753fe02c9b9adf3141c04b04648a84aef52488db224f7d2d516c"}
Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.745536 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ctxc" event={"ID":"751589b1-c864-424f-9315-13a7d880bcf6","Type":"ContainerStarted","Data":"e4d9ea5fb345651a9abba046949c2263214c357d4835a6d860e4e3646f4268a4"}
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.577239 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2tkhk"]
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.579793 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tkhk"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.582590 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.589615 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2tkhk"]
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.749558 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-utilities\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.749629 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-catalog-content\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.749691 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prwnl\" (UniqueName: \"kubernetes.io/projected/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-kube-api-access-prwnl\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.754563 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ctxc" event={"ID":"751589b1-c864-424f-9315-13a7d880bcf6","Type":"ContainerStarted","Data":"32e6eb0387e68198e926f2c853105ade82a1d566e9e146dcc2cd1e280b25141e"}
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.755857 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2qs" event={"ID":"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e","Type":"ContainerStarted","Data":"0361b6e3aad753b5f6696df09aba7a79ad44c5e6aa3bcf8f446d9bb251ff92a2"}
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.762015 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xjhsv"]
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.763737 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjhsv"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.767875 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.783750 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjhsv"]
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.851025 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-utilities\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.851341 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-catalog-content\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.851474 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prwnl\" (UniqueName: \"kubernetes.io/projected/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-kube-api-access-prwnl\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.851700 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-utilities\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.851702 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-catalog-content\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.871709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prwnl\" (UniqueName: \"kubernetes.io/projected/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-kube-api-access-prwnl\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.920243 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tkhk"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.952797 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-utilities\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.952901 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrcl\" (UniqueName: \"kubernetes.io/projected/2914e8af-92f9-40a3-99ea-a52bfaf31a36-kube-api-access-thrcl\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv"
Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.953003 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-catalog-content\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv"
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.055722 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-catalog-content\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv"
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.055848 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-utilities\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv"
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.055896 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrcl\" (UniqueName: \"kubernetes.io/projected/2914e8af-92f9-40a3-99ea-a52bfaf31a36-kube-api-access-thrcl\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv"
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.056676 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-catalog-content\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv"
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.056717 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-utilities\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv"
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.079640 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrcl\" (UniqueName: \"kubernetes.io/projected/2914e8af-92f9-40a3-99ea-a52bfaf31a36-kube-api-access-thrcl\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv"
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.172565 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2tkhk"]
Mar 08 19:39:36 crc kubenswrapper[4885]: W0308 19:39:36.184220 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b34f7ab_2ff3_40fd_8a23_82b9ff4536e9.slice/crio-27b83a6bb0603b7e9cf99e3e16945b5cd2e48181be158667c4512389cda41a3e WatchSource:0}: Error finding container 27b83a6bb0603b7e9cf99e3e16945b5cd2e48181be158667c4512389cda41a3e: Status 404 returned error can't find the container with id 27b83a6bb0603b7e9cf99e3e16945b5cd2e48181be158667c4512389cda41a3e
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.380850 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjhsv"
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.567129 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjhsv"]
Mar 08 19:39:36 crc kubenswrapper[4885]: W0308 19:39:36.571419 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2914e8af_92f9_40a3_99ea_a52bfaf31a36.slice/crio-d4faa57a28891dc1d07e8aacecb90d5cb6b3925d898048a4f99fe3dafe91c8c2 WatchSource:0}: Error finding container d4faa57a28891dc1d07e8aacecb90d5cb6b3925d898048a4f99fe3dafe91c8c2: Status 404 returned error can't find the container with id d4faa57a28891dc1d07e8aacecb90d5cb6b3925d898048a4f99fe3dafe91c8c2
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.761803 4885 generic.go:334] "Generic (PLEG): container finished" podID="2914e8af-92f9-40a3-99ea-a52bfaf31a36" containerID="97dffe85983491bd037e407581a64e9ae51f74e2c9fced366c765c47287882f7" exitCode=0
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.761876 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjhsv" event={"ID":"2914e8af-92f9-40a3-99ea-a52bfaf31a36","Type":"ContainerDied","Data":"97dffe85983491bd037e407581a64e9ae51f74e2c9fced366c765c47287882f7"}
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.762204 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjhsv" event={"ID":"2914e8af-92f9-40a3-99ea-a52bfaf31a36","Type":"ContainerStarted","Data":"d4faa57a28891dc1d07e8aacecb90d5cb6b3925d898048a4f99fe3dafe91c8c2"}
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.765533 4885 generic.go:334] "Generic (PLEG): container finished" podID="751589b1-c864-424f-9315-13a7d880bcf6" containerID="32e6eb0387e68198e926f2c853105ade82a1d566e9e146dcc2cd1e280b25141e" exitCode=0
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.765599 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ctxc" event={"ID":"751589b1-c864-424f-9315-13a7d880bcf6","Type":"ContainerDied","Data":"32e6eb0387e68198e926f2c853105ade82a1d566e9e146dcc2cd1e280b25141e"}
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.769610 4885 generic.go:334] "Generic (PLEG): container finished" podID="bdc128cf-2f55-4964-8229-6aa7e1dd9f1e" containerID="0361b6e3aad753b5f6696df09aba7a79ad44c5e6aa3bcf8f446d9bb251ff92a2" exitCode=0
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.769672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2qs" event={"ID":"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e","Type":"ContainerDied","Data":"0361b6e3aad753b5f6696df09aba7a79ad44c5e6aa3bcf8f446d9bb251ff92a2"}
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.772237 4885 generic.go:334] "Generic (PLEG): container finished" podID="2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9" containerID="6049e9245cb496c144433f4e2fc90a4d6dac1fbaf2d8e3721d7ff19427bd4661" exitCode=0
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.772275 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tkhk" event={"ID":"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9","Type":"ContainerDied","Data":"6049e9245cb496c144433f4e2fc90a4d6dac1fbaf2d8e3721d7ff19427bd4661"}
Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.772313 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tkhk" event={"ID":"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9","Type":"ContainerStarted","Data":"27b83a6bb0603b7e9cf99e3e16945b5cd2e48181be158667c4512389cda41a3e"}
Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.781510 4885 generic.go:334] "Generic (PLEG): container finished" podID="2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9" containerID="d9226a8cd45f91a8c02e9b345d84059dcf8c500de7058e4c867cb59dcd4befef" exitCode=0
Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.781597 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tkhk" event={"ID":"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9","Type":"ContainerDied","Data":"d9226a8cd45f91a8c02e9b345d84059dcf8c500de7058e4c867cb59dcd4befef"}
Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.785976 4885 generic.go:334] "Generic (PLEG): container finished" podID="2914e8af-92f9-40a3-99ea-a52bfaf31a36" containerID="37068f533f46fc4af386aba52810bc9625ff56194945929f07b856f065caeb4b" exitCode=0
Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.786030 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjhsv" event={"ID":"2914e8af-92f9-40a3-99ea-a52bfaf31a36","Type":"ContainerDied","Data":"37068f533f46fc4af386aba52810bc9625ff56194945929f07b856f065caeb4b"}
Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.790426 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ctxc" event={"ID":"751589b1-c864-424f-9315-13a7d880bcf6","Type":"ContainerStarted","Data":"332b2321d2cfb4d39de4da7d751c1878a3faece3fa394bd9b703c026b4619029"}
Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.793087 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2qs" event={"ID":"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e","Type":"ContainerStarted","Data":"897e4783123371a6a16ebc1dc821428baff397298d556a655505ce30473ca924"}
Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.851643 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6ctxc" podStartSLOduration=2.43857022 podStartE2EDuration="4.851620207s" podCreationTimestamp="2026-03-08 19:39:33 +0000 UTC" firstStartedPulling="2026-03-08 19:39:34.749154764 +0000 UTC m=+476.145208797" lastFinishedPulling="2026-03-08 19:39:37.162204761 +0000 UTC m=+478.558258784" observedRunningTime="2026-03-08 19:39:37.83400231 +0000 UTC m=+479.230056343" watchObservedRunningTime="2026-03-08 19:39:37.851620207 +0000 UTC m=+479.247674230"
Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.853624 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bs2qs" podStartSLOduration=2.381635779 podStartE2EDuration="4.85361532s" podCreationTimestamp="2026-03-08 19:39:33 +0000 UTC" firstStartedPulling="2026-03-08 19:39:34.741150752 +0000 UTC m=+476.137204815" lastFinishedPulling="2026-03-08 19:39:37.213130303 +0000 UTC m=+478.609184356" observedRunningTime="2026-03-08 19:39:37.851773091 +0000 UTC m=+479.247827134" watchObservedRunningTime="2026-03-08 19:39:37.85361532 +0000 UTC m=+479.249669353"
Mar 08 19:39:38 crc kubenswrapper[4885]: I0308 19:39:38.802132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tkhk" event={"ID":"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9","Type":"ContainerStarted","Data":"da8ad3e698a9cc0fcf4d86dc50c169cc94ede477a183a379520f4f7956dd3886"}
Mar 08 19:39:38 crc kubenswrapper[4885]: I0308 19:39:38.810258 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjhsv" event={"ID":"2914e8af-92f9-40a3-99ea-a52bfaf31a36","Type":"ContainerStarted","Data":"813d97879d71906ae6bfa2a82a92be63704d49c1e2aedbce45e1bbd8d42db093"}
Mar 08 19:39:38 crc kubenswrapper[4885]: I0308 19:39:38.830693 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2tkhk" podStartSLOduration=2.447087582 podStartE2EDuration="3.830679059s" podCreationTimestamp="2026-03-08 19:39:35 +0000 UTC" firstStartedPulling="2026-03-08 19:39:36.775137679 +0000 UTC m=+478.171191732" lastFinishedPulling="2026-03-08 19:39:38.158729176 +0000 UTC m=+479.554783209" observedRunningTime="2026-03-08 19:39:38.827027592 +0000 UTC m=+480.223081615" watchObservedRunningTime="2026-03-08 19:39:38.830679059 +0000 UTC m=+480.226733082"
Mar 08 19:39:38 crc kubenswrapper[4885]: I0308 19:39:38.851472 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xjhsv" podStartSLOduration=2.370454137 podStartE2EDuration="3.85145036s" podCreationTimestamp="2026-03-08 19:39:35 +0000 UTC" firstStartedPulling="2026-03-08 19:39:36.764903858 +0000 UTC m=+478.160957881" lastFinishedPulling="2026-03-08 19:39:38.245900081 +0000 UTC m=+479.641954104" observedRunningTime="2026-03-08 19:39:38.848583794 +0000 UTC m=+480.244637817" watchObservedRunningTime="2026-03-08 19:39:38.85145036 +0000 UTC m=+480.247504393"
Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 19:39:43.492952 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 19:39:43.493334 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 19:39:43.549242 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 19:39:43.689590 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 19:39:43.689667 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 19:39:43.910745 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6ctxc"
Mar 08 19:39:44 crc kubenswrapper[4885]: I0308 19:39:44.734691 4885 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-bs2qs" podUID="bdc128cf-2f55-4964-8229-6aa7e1dd9f1e" containerName="registry-server" probeResult="failure" output=< Mar 08 19:39:44 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 19:39:44 crc kubenswrapper[4885]: > Mar 08 19:39:45 crc kubenswrapper[4885]: I0308 19:39:45.427977 4885 scope.go:117] "RemoveContainer" containerID="90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548" Mar 08 19:39:45 crc kubenswrapper[4885]: I0308 19:39:45.920794 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:45 crc kubenswrapper[4885]: I0308 19:39:45.920890 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:45 crc kubenswrapper[4885]: I0308 19:39:45.972181 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:46 crc kubenswrapper[4885]: I0308 19:39:46.381439 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:46 crc kubenswrapper[4885]: I0308 19:39:46.381915 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:46 crc kubenswrapper[4885]: I0308 19:39:46.440787 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:46 crc kubenswrapper[4885]: I0308 19:39:46.915381 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:46 crc kubenswrapper[4885]: I0308 19:39:46.936679 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2tkhk" 
Mar 08 19:39:50 crc kubenswrapper[4885]: I0308 19:39:50.082341 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2"
Mar 08 19:39:50 crc kubenswrapper[4885]: I0308 19:39:50.154156 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"]
Mar 08 19:39:53 crc kubenswrapper[4885]: I0308 19:39:53.754098 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:53 crc kubenswrapper[4885]: I0308 19:39:53.826184 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bs2qs"
Mar 08 19:39:56 crc kubenswrapper[4885]: I0308 19:39:56.690635 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.153083 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549980-lx7sw"]
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.154837 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549980-lx7sw"
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.156987 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.157501 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.158533 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.164356 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549980-lx7sw"]
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.225837 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x59f\" (UniqueName: \"kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f\") pod \"auto-csr-approver-29549980-lx7sw\" (UID: \"ffc137d5-821a-406d-8db5-d396d0091991\") " pod="openshift-infra/auto-csr-approver-29549980-lx7sw"
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.327656 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x59f\" (UniqueName: \"kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f\") pod \"auto-csr-approver-29549980-lx7sw\" (UID: \"ffc137d5-821a-406d-8db5-d396d0091991\") " pod="openshift-infra/auto-csr-approver-29549980-lx7sw"
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.361521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x59f\" (UniqueName: \"kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f\") pod \"auto-csr-approver-29549980-lx7sw\" (UID: \"ffc137d5-821a-406d-8db5-d396d0091991\") " pod="openshift-infra/auto-csr-approver-29549980-lx7sw"
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.491875 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549980-lx7sw"
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.719897 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549980-lx7sw"]
Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.995445 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549980-lx7sw" event={"ID":"ffc137d5-821a-406d-8db5-d396d0091991","Type":"ContainerStarted","Data":"b1ccc77e9d3d51c4820645e33938e38051251e7b2f1a89c945f3a73c5831bac8"}
Mar 08 19:40:02 crc kubenswrapper[4885]: I0308 19:40:02.854660 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 19:40:02 crc kubenswrapper[4885]: I0308 19:40:02.855323 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 19:40:02 crc kubenswrapper[4885]: I0308 19:40:02.855386 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 19:40:02 crc kubenswrapper[4885]: I0308 19:40:02.856260 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 19:40:02 crc kubenswrapper[4885]: I0308 19:40:02.856383 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc" gracePeriod=600
Mar 08 19:40:03 crc kubenswrapper[4885]: I0308 19:40:03.014421 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc" exitCode=0
Mar 08 19:40:03 crc kubenswrapper[4885]: I0308 19:40:03.014514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc"}
Mar 08 19:40:03 crc kubenswrapper[4885]: I0308 19:40:03.014611 4885 scope.go:117] "RemoveContainer" containerID="c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a"
Mar 08 19:40:03 crc kubenswrapper[4885]: I0308 19:40:03.019630 4885 generic.go:334] "Generic (PLEG): container finished" podID="ffc137d5-821a-406d-8db5-d396d0091991" containerID="6b3edc0ab6930c447e72d1b9e0e05c67fcbcc8c8cb108ba4b449e1f4acc1e00e" exitCode=0
Mar 08 19:40:03 crc kubenswrapper[4885]: I0308 19:40:03.019710 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549980-lx7sw" event={"ID":"ffc137d5-821a-406d-8db5-d396d0091991","Type":"ContainerDied","Data":"6b3edc0ab6930c447e72d1b9e0e05c67fcbcc8c8cb108ba4b449e1f4acc1e00e"}
Mar 08 19:40:04 crc kubenswrapper[4885]: I0308 19:40:04.029871 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f"}
Mar 08 19:40:04 crc kubenswrapper[4885]: I0308 19:40:04.355185 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549980-lx7sw"
Mar 08 19:40:04 crc kubenswrapper[4885]: I0308 19:40:04.390991 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x59f\" (UniqueName: \"kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f\") pod \"ffc137d5-821a-406d-8db5-d396d0091991\" (UID: \"ffc137d5-821a-406d-8db5-d396d0091991\") "
Mar 08 19:40:04 crc kubenswrapper[4885]: I0308 19:40:04.410036 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f" (OuterVolumeSpecName: "kube-api-access-2x59f") pod "ffc137d5-821a-406d-8db5-d396d0091991" (UID: "ffc137d5-821a-406d-8db5-d396d0091991"). InnerVolumeSpecName "kube-api-access-2x59f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:40:04 crc kubenswrapper[4885]: I0308 19:40:04.493709 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x59f\" (UniqueName: \"kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f\") on node \"crc\" DevicePath \"\""
Mar 08 19:40:05 crc kubenswrapper[4885]: I0308 19:40:05.042347 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549980-lx7sw" event={"ID":"ffc137d5-821a-406d-8db5-d396d0091991","Type":"ContainerDied","Data":"b1ccc77e9d3d51c4820645e33938e38051251e7b2f1a89c945f3a73c5831bac8"}
Mar 08 19:40:05 crc kubenswrapper[4885]: I0308 19:40:05.042392 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1ccc77e9d3d51c4820645e33938e38051251e7b2f1a89c945f3a73c5831bac8"
Mar 08 19:40:05 crc kubenswrapper[4885]: I0308 19:40:05.042354 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549980-lx7sw"
Mar 08 19:40:05 crc kubenswrapper[4885]: I0308 19:40:05.425479 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549974-jjqkh"]
Mar 08 19:40:05 crc kubenswrapper[4885]: I0308 19:40:05.428681 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549974-jjqkh"]
Mar 08 19:40:07 crc kubenswrapper[4885]: I0308 19:40:07.381611 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60da1edb-8474-4368-a6ae-0bb2b1b7b845" path="/var/lib/kubelet/pods/60da1edb-8474-4368-a6ae-0bb2b1b7b845/volumes"
Mar 08 19:40:15 crc kubenswrapper[4885]: I0308 19:40:15.210604 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" podUID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" containerName="registry" containerID="cri-o://f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c" gracePeriod=30
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.645399 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667430 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfrhm\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") "
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667503 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") "
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667532 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") "
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667571 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") "
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667615 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") "
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667731 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") "
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667825 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") "
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667870 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") "
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.668571 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.668728 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.674789 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm" (OuterVolumeSpecName: "kube-api-access-mfrhm") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "kube-api-access-mfrhm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.675881 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.676140 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.676500 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.679595 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.684803 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769601 4885 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769645 4885 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769664 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769681 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769751 4885 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769773 4885 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769791 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfrhm\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm\") on node \"crc\" DevicePath \"\""
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.133338 4885 generic.go:334] "Generic (PLEG): container finished" podID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" containerID="f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c" exitCode=0
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.133404 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" event={"ID":"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1","Type":"ContainerDied","Data":"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c"}
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.133448 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78"
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.133504 4885 scope.go:117] "RemoveContainer" containerID="f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c"
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.133484 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" event={"ID":"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1","Type":"ContainerDied","Data":"d5dd902e3ef717231619c64b1b91e79b07a9f0b3233c92d0567cafca72b99c09"}
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.168073 4885 scope.go:117] "RemoveContainer" containerID="f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c"
Mar 08 19:40:16 crc kubenswrapper[4885]: E0308 19:40:16.168794 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c\": container with ID starting with f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c not found: ID does not exist" containerID="f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c"
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.169164 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c"} err="failed to get container status \"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c\": rpc error: code = NotFound desc = could not find container \"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c\": container with ID starting with f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c not found: ID does not exist"
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.189302 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"]
Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.204907 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"]
Mar 08 19:40:17 crc kubenswrapper[4885]: I0308 19:40:17.380162 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" path="/var/lib/kubelet/pods/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1/volumes"
Mar 08 19:41:45 crc kubenswrapper[4885]: I0308 19:41:45.503624 4885 scope.go:117] "RemoveContainer" containerID="67a173c7d23f4e826672d366138f3bcda3d03e275f18ddffd328f814fe2d0924"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.148051 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549982-jnpc6"]
Mar 08 19:42:00 crc kubenswrapper[4885]: E0308 19:42:00.149661 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" containerName="registry"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.149687 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" containerName="registry"
Mar 08 19:42:00 crc kubenswrapper[4885]: E0308 19:42:00.149719 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc137d5-821a-406d-8db5-d396d0091991" containerName="oc"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.149732 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc137d5-821a-406d-8db5-d396d0091991" containerName="oc"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.149888 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc137d5-821a-406d-8db5-d396d0091991" containerName="oc"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.149958 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" containerName="registry"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.151499 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549982-jnpc6"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.154044 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.154748 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.156519 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.160632 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549982-jnpc6"]
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.338022 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ndnv\" (UniqueName: \"kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv\") pod \"auto-csr-approver-29549982-jnpc6\" (UID: \"30a90e18-1089-40ae-a5f0-f43b1d252129\") " pod="openshift-infra/auto-csr-approver-29549982-jnpc6"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.439142 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ndnv\" (UniqueName: \"kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv\") pod \"auto-csr-approver-29549982-jnpc6\" (UID: \"30a90e18-1089-40ae-a5f0-f43b1d252129\") " pod="openshift-infra/auto-csr-approver-29549982-jnpc6"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.471951 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ndnv\" (UniqueName: \"kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv\") pod \"auto-csr-approver-29549982-jnpc6\" (UID: \"30a90e18-1089-40ae-a5f0-f43b1d252129\") " pod="openshift-infra/auto-csr-approver-29549982-jnpc6"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.486496 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549982-jnpc6"
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.718352 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549982-jnpc6"]
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.725327 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.863764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549982-jnpc6" event={"ID":"30a90e18-1089-40ae-a5f0-f43b1d252129","Type":"ContainerStarted","Data":"6cbfe7a51bebe6572b818b81007b0d0586ecc420467ea896bbbdaf492746085b"}
Mar 08 19:42:02 crc kubenswrapper[4885]: I0308 19:42:02.888123 4885 generic.go:334] "Generic (PLEG): container finished" podID="30a90e18-1089-40ae-a5f0-f43b1d252129" containerID="c050624aad83fb4c435f0fa087d6fe3ddc6c1b029b5c8f9e354ed5228ef2d3fa" exitCode=0
Mar 08 19:42:02 crc kubenswrapper[4885]: I0308 19:42:02.888323 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549982-jnpc6" event={"ID":"30a90e18-1089-40ae-a5f0-f43b1d252129","Type":"ContainerDied","Data":"c050624aad83fb4c435f0fa087d6fe3ddc6c1b029b5c8f9e354ed5228ef2d3fa"}
Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.186744 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549982-jnpc6"
Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.300262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ndnv\" (UniqueName: \"kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv\") pod \"30a90e18-1089-40ae-a5f0-f43b1d252129\" (UID: \"30a90e18-1089-40ae-a5f0-f43b1d252129\") "
Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.305624 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv" (OuterVolumeSpecName: "kube-api-access-8ndnv") pod "30a90e18-1089-40ae-a5f0-f43b1d252129" (UID: "30a90e18-1089-40ae-a5f0-f43b1d252129"). InnerVolumeSpecName "kube-api-access-8ndnv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.402215 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ndnv\" (UniqueName: \"kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv\") on node \"crc\" DevicePath \"\""
Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.904767 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549982-jnpc6" event={"ID":"30a90e18-1089-40ae-a5f0-f43b1d252129","Type":"ContainerDied","Data":"6cbfe7a51bebe6572b818b81007b0d0586ecc420467ea896bbbdaf492746085b"}
Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.904807 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cbfe7a51bebe6572b818b81007b0d0586ecc420467ea896bbbdaf492746085b"
Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.904856 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549982-jnpc6"
Mar 08 19:42:05 crc kubenswrapper[4885]: I0308 19:42:05.263182 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549976-nhqg4"]
Mar 08 19:42:05 crc kubenswrapper[4885]: I0308 19:42:05.269021 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549976-nhqg4"]
Mar 08 19:42:05 crc kubenswrapper[4885]: I0308 19:42:05.385890 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" path="/var/lib/kubelet/pods/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58/volumes"
Mar 08 19:42:32 crc kubenswrapper[4885]: I0308 19:42:32.818032 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 08 19:42:32 crc kubenswrapper[4885]: I0308 19:42:32.818702 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:42:45 crc kubenswrapper[4885]: I0308 19:42:45.574285 4885 scope.go:117] "RemoveContainer" containerID="0685a4906cd8df57a6fc2f16599ba5b339b14b8ee4e2f165c183f54473c7f2ff" Mar 08 19:42:45 crc kubenswrapper[4885]: I0308 19:42:45.640552 4885 scope.go:117] "RemoveContainer" containerID="7e9e48dffa2c1d1b35dc16a09da7078a95e25a71cb56e7dff87781cbf3d61f90" Mar 08 19:43:02 crc kubenswrapper[4885]: I0308 19:43:02.818684 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:43:02 crc kubenswrapper[4885]: I0308 19:43:02.819493 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:43:32 crc kubenswrapper[4885]: I0308 19:43:32.818795 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:43:32 crc kubenswrapper[4885]: I0308 19:43:32.819661 4885 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:43:32 crc kubenswrapper[4885]: I0308 19:43:32.819782 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:43:32 crc kubenswrapper[4885]: I0308 19:43:32.821748 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 19:43:32 crc kubenswrapper[4885]: I0308 19:43:32.822002 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f" gracePeriod=600 Mar 08 19:43:33 crc kubenswrapper[4885]: I0308 19:43:33.508626 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f" exitCode=0 Mar 08 19:43:33 crc kubenswrapper[4885]: I0308 19:43:33.508839 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f"} Mar 08 19:43:33 crc kubenswrapper[4885]: I0308 19:43:33.509241 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2"} Mar 08 19:43:33 crc kubenswrapper[4885]: I0308 19:43:33.509280 4885 scope.go:117] "RemoveContainer" containerID="7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.147410 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549984-4fjvc"] Mar 08 19:44:00 crc kubenswrapper[4885]: E0308 19:44:00.148424 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a90e18-1089-40ae-a5f0-f43b1d252129" containerName="oc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.148442 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a90e18-1089-40ae-a5f0-f43b1d252129" containerName="oc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.148586 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a90e18-1089-40ae-a5f0-f43b1d252129" containerName="oc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.149080 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.151491 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.151611 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.152714 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.158273 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549984-4fjvc"] Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.257690 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjzrg\" (UniqueName: \"kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg\") pod \"auto-csr-approver-29549984-4fjvc\" (UID: \"383ac947-e1b1-4f15-98a6-69fcc60e0ac1\") " pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.360687 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjzrg\" (UniqueName: \"kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg\") pod \"auto-csr-approver-29549984-4fjvc\" (UID: \"383ac947-e1b1-4f15-98a6-69fcc60e0ac1\") " pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.400001 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjzrg\" (UniqueName: \"kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg\") pod \"auto-csr-approver-29549984-4fjvc\" (UID: \"383ac947-e1b1-4f15-98a6-69fcc60e0ac1\") " 
pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.474433 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.941660 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549984-4fjvc"] Mar 08 19:44:01 crc kubenswrapper[4885]: I0308 19:44:01.713473 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" event={"ID":"383ac947-e1b1-4f15-98a6-69fcc60e0ac1","Type":"ContainerStarted","Data":"94961ba79d03d33e7b40a64d757b8b45e77ba4e1d9fd1b33e83c080d4e80b3b6"} Mar 08 19:44:02 crc kubenswrapper[4885]: E0308 19:44:02.500178 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod383ac947_e1b1_4f15_98a6_69fcc60e0ac1.slice/crio-b7734316fc145363037328cab9f126d7c1da55c60bd2f7c56f716841be40429c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod383ac947_e1b1_4f15_98a6_69fcc60e0ac1.slice/crio-conmon-b7734316fc145363037328cab9f126d7c1da55c60bd2f7c56f716841be40429c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:44:02 crc kubenswrapper[4885]: I0308 19:44:02.724505 4885 generic.go:334] "Generic (PLEG): container finished" podID="383ac947-e1b1-4f15-98a6-69fcc60e0ac1" containerID="b7734316fc145363037328cab9f126d7c1da55c60bd2f7c56f716841be40429c" exitCode=0 Mar 08 19:44:02 crc kubenswrapper[4885]: I0308 19:44:02.724575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" event={"ID":"383ac947-e1b1-4f15-98a6-69fcc60e0ac1","Type":"ContainerDied","Data":"b7734316fc145363037328cab9f126d7c1da55c60bd2f7c56f716841be40429c"} Mar 08 
19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.051582 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.215093 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjzrg\" (UniqueName: \"kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg\") pod \"383ac947-e1b1-4f15-98a6-69fcc60e0ac1\" (UID: \"383ac947-e1b1-4f15-98a6-69fcc60e0ac1\") " Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.223731 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg" (OuterVolumeSpecName: "kube-api-access-mjzrg") pod "383ac947-e1b1-4f15-98a6-69fcc60e0ac1" (UID: "383ac947-e1b1-4f15-98a6-69fcc60e0ac1"). InnerVolumeSpecName "kube-api-access-mjzrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.316631 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjzrg\" (UniqueName: \"kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg\") on node \"crc\" DevicePath \"\"" Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.739452 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" event={"ID":"383ac947-e1b1-4f15-98a6-69fcc60e0ac1","Type":"ContainerDied","Data":"94961ba79d03d33e7b40a64d757b8b45e77ba4e1d9fd1b33e83c080d4e80b3b6"} Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.739547 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94961ba79d03d33e7b40a64d757b8b45e77ba4e1d9fd1b33e83c080d4e80b3b6" Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.739567 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:05 crc kubenswrapper[4885]: I0308 19:44:05.139136 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549978-9wjmp"] Mar 08 19:44:05 crc kubenswrapper[4885]: I0308 19:44:05.141947 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549978-9wjmp"] Mar 08 19:44:05 crc kubenswrapper[4885]: I0308 19:44:05.376216 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a3a17a-ffa4-4d31-94fb-7e720297e94b" path="/var/lib/kubelet/pods/71a3a17a-ffa4-4d31-94fb-7e720297e94b/volumes" Mar 08 19:44:45 crc kubenswrapper[4885]: I0308 19:44:45.722629 4885 scope.go:117] "RemoveContainer" containerID="2aa80e241984cb33e73f4238b00c6079576ad160fd6e5654000fab91ecb22f03" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.558532 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-psfrk"] Mar 08 19:44:58 crc kubenswrapper[4885]: E0308 19:44:58.560020 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383ac947-e1b1-4f15-98a6-69fcc60e0ac1" containerName="oc" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.560046 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="383ac947-e1b1-4f15-98a6-69fcc60e0ac1" containerName="oc" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.560227 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="383ac947-e1b1-4f15-98a6-69fcc60e0ac1" containerName="oc" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.561577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.566144 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.566484 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.567438 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.570590 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-psfrk"] Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.572276 4885 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-lm6tv" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.729796 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-966qv\" (UniqueName: \"kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.729961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.730009 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt\") pod \"crc-storage-crc-psfrk\" (UID: 
\"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.831173 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-966qv\" (UniqueName: \"kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.831296 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.831390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.831831 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.832677 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.864510 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-966qv\" (UniqueName: \"kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.891298 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:59 crc kubenswrapper[4885]: I0308 19:44:59.159475 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-psfrk"] Mar 08 19:44:59 crc kubenswrapper[4885]: W0308 19:44:59.168186 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a3c0554_f8ec_4a68_a332_1eba738b28c6.slice/crio-a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015 WatchSource:0}: Error finding container a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015: Status 404 returned error can't find the container with id a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015 Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.114540 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-psfrk" event={"ID":"1a3c0554-f8ec-4a68-a332-1eba738b28c6","Type":"ContainerStarted","Data":"a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015"} Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.143290 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn"] Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.144124 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.147593 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.148955 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tdlp\" (UniqueName: \"kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.149046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.149081 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.149879 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.167576 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn"] Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.250856 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tdlp\" (UniqueName: \"kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.250980 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.251020 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.252547 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.269827 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.293578 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tdlp\" (UniqueName: \"kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.477858 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.971164 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn"] Mar 08 19:45:01 crc kubenswrapper[4885]: W0308 19:45:01.010814 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8673a65_b7c8_4c06_9713_a095b399358a.slice/crio-c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b WatchSource:0}: Error finding container c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b: Status 404 returned error can't find the container with id c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.127889 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a3c0554-f8ec-4a68-a332-1eba738b28c6" containerID="b57ab7fa02bc0b3cc8cdcea97c2fd6abc762d9cf4cd62e7caa0369ec5c53eef8" exitCode=0 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.127996 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="crc-storage/crc-storage-crc-psfrk" event={"ID":"1a3c0554-f8ec-4a68-a332-1eba738b28c6","Type":"ContainerDied","Data":"b57ab7fa02bc0b3cc8cdcea97c2fd6abc762d9cf4cd62e7caa0369ec5c53eef8"} Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.129361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" event={"ID":"f8673a65-b7c8-4c06-9713-a095b399358a","Type":"ContainerStarted","Data":"c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b"} Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.590886 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bssfh"] Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.591662 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-controller" containerID="cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.591725 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="nbdb" containerID="cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.591948 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="sbdb" containerID="cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.592069 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" 
podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-acl-logging" containerID="cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.592115 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="northd" containerID="cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.592065 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-node" containerID="cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.592162 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.646215 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" containerID="cri-o://9417467a958808ec76e7baaf3e912528258fa08f33b991a9a656c8f2699dfe08" gracePeriod=30 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.138479 4885 generic.go:334] "Generic (PLEG): container finished" podID="f8673a65-b7c8-4c06-9713-a095b399358a" containerID="dc7b1fe292df06f58ac62305ed639526799d6857e418c3744dffefa96ddd2209" exitCode=0 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.138647 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" event={"ID":"f8673a65-b7c8-4c06-9713-a095b399358a","Type":"ContainerDied","Data":"dc7b1fe292df06f58ac62305ed639526799d6857e418c3744dffefa96ddd2209"}
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.143284 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/3.log"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.148074 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-acl-logging/0.log"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.149315 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-controller/0.log"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150023 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="9417467a958808ec76e7baaf3e912528258fa08f33b991a9a656c8f2699dfe08" exitCode=0
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150073 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3" exitCode=0
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150091 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44" exitCode=0
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150115 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220" exitCode=0
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150134 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680" exitCode=0
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150152 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138" exitCode=0
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150149 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"9417467a958808ec76e7baaf3e912528258fa08f33b991a9a656c8f2699dfe08"}
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150222 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3"}
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150250 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44"}
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220"}
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150287 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680"}
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150308 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138"}
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150312 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150326 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6"}
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150173 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6" exitCode=143
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150451 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0" exitCode=143
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150555 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0"}
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.157982 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/2.log"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.159507 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/1.log"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.159900 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerDied","Data":"47b9aa6e943174d2f8819d017007c51f3809d8a8e2d7a64900f1aa71bf065584"}
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.160658 4885 generic.go:334] "Generic (PLEG): container finished" podID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" containerID="47b9aa6e943174d2f8819d017007c51f3809d8a8e2d7a64900f1aa71bf065584" exitCode=2
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.165410 4885 scope.go:117] "RemoveContainer" containerID="47b9aa6e943174d2f8819d017007c51f3809d8a8e2d7a64900f1aa71bf065584"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.165989 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ff7b4_openshift-multus(9ac72c25-d3e6-4dda-8444-6cd4442af7e4)\"" pod="openshift-multus/multus-ff7b4" podUID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.189187 4885 scope.go:117] "RemoveContainer" containerID="f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.238014 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-psfrk"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.367831 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-acl-logging/0.log"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.368598 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-controller/0.log"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.369290 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.383766 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-966qv\" (UniqueName: \"kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv\") pod \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.383911 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage\") pod \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.384062 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt\") pod \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.384141 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1a3c0554-f8ec-4a68-a332-1eba738b28c6" (UID: "1a3c0554-f8ec-4a68-a332-1eba738b28c6"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.384431 4885 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.391081 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv" (OuterVolumeSpecName: "kube-api-access-966qv") pod "1a3c0554-f8ec-4a68-a332-1eba738b28c6" (UID: "1a3c0554-f8ec-4a68-a332-1eba738b28c6"). InnerVolumeSpecName "kube-api-access-966qv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.414266 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1a3c0554-f8ec-4a68-a332-1eba738b28c6" (UID: "1a3c0554-f8ec-4a68-a332-1eba738b28c6"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.436649 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rbdmt"]
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.436907 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-node"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.436948 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-node"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.436959 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.436967 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.436976 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3c0554-f8ec-4a68-a332-1eba738b28c6" containerName="storage"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.436984 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3c0554-f8ec-4a68-a332-1eba738b28c6" containerName="storage"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.436998 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kubecfg-setup"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437006 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kubecfg-setup"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437018 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437026 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437037 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="northd"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437045 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="northd"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437054 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="nbdb"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437062 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="nbdb"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437076 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-ovn-metrics"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437085 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-ovn-metrics"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437096 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-acl-logging"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437103 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-acl-logging"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437114 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437122 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437136 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437144 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437154 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="sbdb"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437162 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="sbdb"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437295 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437311 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="sbdb"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437323 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3c0554-f8ec-4a68-a332-1eba738b28c6" containerName="storage"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437333 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-node"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437343 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437352 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-acl-logging"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437365 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437374 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-ovn-metrics"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437382 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="northd"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437393 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="nbdb"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437402 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437413 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437518 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437528 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437649 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437752 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437762 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.439666 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485519 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485623 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485685 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485768 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485836 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485856 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485998 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket" (OuterVolumeSpecName: "log-socket") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486362 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486466 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486503 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486542 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486595 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486652 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486681 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486719 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486769 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486826 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486875 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486983 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487047 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487127 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487140 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487172 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487190 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log" (OuterVolumeSpecName: "node-log") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487127 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487265 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487410 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487465 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487536 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487584 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mlvt\" (UniqueName: \"kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") "
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487535 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487739 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487749 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-slash\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487807 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-script-lib\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487745 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash" (OuterVolumeSpecName: "host-slash") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487791 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487844 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487888 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovn-node-metrics-cert\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487990 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-netd\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488036 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-systemd-units\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-systemd\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488106 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52khv\" (UniqueName: \"kubernetes.io/projected/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-kube-api-access-52khv\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488195 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488219 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-kubelet\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488423 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-log-socket\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt"
Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488524 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName:
\"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488809 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-ovn\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488985 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-env-overrides\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489122 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-var-lib-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489395 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-node-log\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489514 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-config\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489631 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-netns\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489731 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-bin\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489831 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-etc-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490073 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490154 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490273 4885 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490343 4885 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490411 4885 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490479 4885 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490553 4885 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490636 4885 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490705 4885 reconciler_common.go:293] "Volume detached 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490774 4885 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490843 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490943 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491043 4885 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491123 4885 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491193 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-966qv\" (UniqueName: \"kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491268 4885 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491346 4885 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491415 4885 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491500 4885 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491647 4885 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491726 4885 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490742 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.492396 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt" (OuterVolumeSpecName: "kube-api-access-5mlvt") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "kube-api-access-5mlvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.511650 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593180 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-slash\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593244 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-script-lib\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593315 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovn-node-metrics-cert\") pod 
\"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593352 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-netd\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593391 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-systemd-units\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593425 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-systemd\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593458 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52khv\" (UniqueName: \"kubernetes.io/projected/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-kube-api-access-52khv\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593492 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-kubelet\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-log-socket\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593562 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593591 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-ovn\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593627 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-env-overrides\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593663 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-var-lib-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: 
I0308 19:45:02.593714 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-node-log\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593744 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-config\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593772 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-netns\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593805 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-bin\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593849 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-etc-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593879 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593954 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594022 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mlvt\" (UniqueName: \"kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594045 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594065 4885 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594141 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: 
I0308 19:45:02.594230 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-slash\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594301 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594319 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-ovn\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595358 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-env-overrides\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595427 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-var-lib-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-node-log\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595615 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-script-lib\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595810 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-log-socket\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595821 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-kubelet\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595885 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-bin\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595815 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-netd\") pod \"ovnkube-node-rbdmt\" (UID: 
\"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595948 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-systemd-units\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595986 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-etc-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595984 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-netns\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.596011 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.596314 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-config\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 
19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.596544 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-systemd\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.601479 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovn-node-metrics-cert\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.627044 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52khv\" (UniqueName: \"kubernetes.io/projected/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-kube-api-access-52khv\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.756282 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: W0308 19:45:02.789634 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod003d9fa9_c0c3_4af8_bdb0_084620b29ae0.slice/crio-ef7df8ae2e88cb7722a7e79b23bb9ffe544a42c65d853e531de3a9badab52555 WatchSource:0}: Error finding container ef7df8ae2e88cb7722a7e79b23bb9ffe544a42c65d853e531de3a9badab52555: Status 404 returned error can't find the container with id ef7df8ae2e88cb7722a7e79b23bb9ffe544a42c65d853e531de3a9badab52555 Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.171288 4885 generic.go:334] "Generic (PLEG): container finished" podID="003d9fa9-c0c3-4af8-bdb0-084620b29ae0" containerID="2eb629dd73ff88d9831cd476b65ccd4903a995f4f9b25e669a7928d96bd36e93" exitCode=0 Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.173212 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerDied","Data":"2eb629dd73ff88d9831cd476b65ccd4903a995f4f9b25e669a7928d96bd36e93"} Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.173440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"ef7df8ae2e88cb7722a7e79b23bb9ffe544a42c65d853e531de3a9badab52555"} Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.178042 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-acl-logging/0.log" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.179903 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-controller/0.log" Mar 08 19:45:03 crc 
kubenswrapper[4885]: I0308 19:45:03.180598 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"54661ff92d86d446f0561f70be37d97fffc952cd7edc4f3f4e212f70264f4183"} Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.180650 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.180658 4885 scope.go:117] "RemoveContainer" containerID="9417467a958808ec76e7baaf3e912528258fa08f33b991a9a656c8f2699dfe08" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.184469 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-psfrk" event={"ID":"1a3c0554-f8ec-4a68-a332-1eba738b28c6","Type":"ContainerDied","Data":"a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015"} Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.184526 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.184528 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.190515 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/2.log" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.214314 4885 scope.go:117] "RemoveContainer" containerID="fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.248199 4885 scope.go:117] "RemoveContainer" containerID="1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.249718 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bssfh"] Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.256958 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bssfh"] Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.279488 4885 scope.go:117] "RemoveContainer" containerID="379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.280445 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.306405 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume\") pod \"f8673a65-b7c8-4c06-9713-a095b399358a\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.306484 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume\") pod \"f8673a65-b7c8-4c06-9713-a095b399358a\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.306517 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tdlp\" (UniqueName: \"kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp\") pod \"f8673a65-b7c8-4c06-9713-a095b399358a\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.310860 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp" (OuterVolumeSpecName: "kube-api-access-6tdlp") pod "f8673a65-b7c8-4c06-9713-a095b399358a" (UID: "f8673a65-b7c8-4c06-9713-a095b399358a"). InnerVolumeSpecName "kube-api-access-6tdlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.311430 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume" (OuterVolumeSpecName: "config-volume") pod "f8673a65-b7c8-4c06-9713-a095b399358a" (UID: "f8673a65-b7c8-4c06-9713-a095b399358a"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.317850 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f8673a65-b7c8-4c06-9713-a095b399358a" (UID: "f8673a65-b7c8-4c06-9713-a095b399358a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.363120 4885 scope.go:117] "RemoveContainer" containerID="ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.376000 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" path="/var/lib/kubelet/pods/dedec2a4-d864-4f30-8a2d-b3168817ea34/volumes" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.379284 4885 scope.go:117] "RemoveContainer" containerID="409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.393965 4885 scope.go:117] "RemoveContainer" containerID="9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.408140 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.408168 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tdlp\" (UniqueName: \"kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.408182 4885 reconciler_common.go:293] "Volume detached for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.427739 4885 scope.go:117] "RemoveContainer" containerID="f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.451111 4885 scope.go:117] "RemoveContainer" containerID="ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e" Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351153 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"e8f3eb63d0193da86832075983e64d83e02a6d9c9f4aeb44f8631f8ffd3988ad"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351533 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"0891820ae75278e337fd3cc610e870e88fee31edc84e52dbed049bf24b17cacf"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351554 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"73f370b34d05e9b11ea5ea3cf54e2bf119977d2c88048fd57601e3135cce7a88"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351571 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"cf61f26d04d904050a733c6b1a9569445872afce269be85469582bf7f7e99351"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351588 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" 
event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"5185e11c0691c4782a4c0ff035ead7be04829de0c8dce43b90328cce5133e774"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351604 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"8614aa3e5915b1864cd05579f93795d5c9d9fe400c04ae5562d37b8fb3e57dca"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.353509 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" event={"ID":"f8673a65-b7c8-4c06-9713-a095b399358a","Type":"ContainerDied","Data":"c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.353544 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b" Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.353607 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:06 crc kubenswrapper[4885]: I0308 19:45:06.372993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"427ded321330b2ecdaf11f8ec9439d128083d97f20ecd6bedcbc5c34c31bd54f"} Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.395954 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"7e3f56d51aaeefec1253d6bb79a15a4262dc2f53f1f13c78f069a85fe97dfd47"} Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.396492 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.396509 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.396523 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.429137 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.436444 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.441422 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" podStartSLOduration=7.441403525 podStartE2EDuration="7.441403525s" podCreationTimestamp="2026-03-08 19:45:02 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:45:09.438181639 +0000 UTC m=+810.834235672" watchObservedRunningTime="2026-03-08 19:45:09.441403525 +0000 UTC m=+810.837457548" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.890731 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf"] Mar 08 19:45:10 crc kubenswrapper[4885]: E0308 19:45:10.891483 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8673a65-b7c8-4c06-9713-a095b399358a" containerName="collect-profiles" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.891505 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8673a65-b7c8-4c06-9713-a095b399358a" containerName="collect-profiles" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.891671 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8673a65-b7c8-4c06-9713-a095b399358a" containerName="collect-profiles" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.893081 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.903863 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.904269 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf"] Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.003955 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.004007 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhgwp\" (UniqueName: \"kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.004058 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: 
I0308 19:45:11.104851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.104939 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.104976 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhgwp\" (UniqueName: \"kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.105747 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.106654 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.125996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhgwp\" (UniqueName: \"kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.212427 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.249094 4885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(bde0de45022d463e84a7bb31175935b344a221d5d6de8236d3179236040ec8c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.249253 4885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(bde0de45022d463e84a7bb31175935b344a221d5d6de8236d3179236040ec8c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.249320 4885 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(bde0de45022d463e84a7bb31175935b344a221d5d6de8236d3179236040ec8c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.249429 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace(1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace(1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(bde0de45022d463e84a7bb31175935b344a221d5d6de8236d3179236040ec8c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.408724 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.409835 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.441874 4885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(9ef87c915fa858881076d675c7a2399f972d37954a1c9df6a93f8cdb9bf1627b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.442229 4885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(9ef87c915fa858881076d675c7a2399f972d37954a1c9df6a93f8cdb9bf1627b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.442256 4885 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(9ef87c915fa858881076d675c7a2399f972d37954a1c9df6a93f8cdb9bf1627b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.442312 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace(1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace(1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(9ef87c915fa858881076d675c7a2399f972d37954a1c9df6a93f8cdb9bf1627b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" Mar 08 19:45:13 crc kubenswrapper[4885]: I0308 19:45:13.368390 4885 scope.go:117] "RemoveContainer" containerID="47b9aa6e943174d2f8819d017007c51f3809d8a8e2d7a64900f1aa71bf065584" Mar 08 19:45:14 crc kubenswrapper[4885]: I0308 19:45:14.429164 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/2.log" Mar 08 19:45:14 crc kubenswrapper[4885]: I0308 19:45:14.429419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerStarted","Data":"4b86a03c5c9a206f4ea2240f652d8bef1ba716966723ee8fc5b64f7d118486a7"} Mar 08 19:45:22 crc kubenswrapper[4885]: I0308 19:45:22.367641 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:22 crc kubenswrapper[4885]: I0308 19:45:22.368965 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:22 crc kubenswrapper[4885]: I0308 19:45:22.700647 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf"] Mar 08 19:45:23 crc kubenswrapper[4885]: I0308 19:45:23.493525 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerID="86f9b94c866e293c0097cd26686dd912dc8dd9a05680cd99bc3c03cd64187c52" exitCode=0 Mar 08 19:45:23 crc kubenswrapper[4885]: I0308 19:45:23.493623 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" event={"ID":"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67","Type":"ContainerDied","Data":"86f9b94c866e293c0097cd26686dd912dc8dd9a05680cd99bc3c03cd64187c52"} Mar 08 19:45:23 crc kubenswrapper[4885]: I0308 19:45:23.494004 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" event={"ID":"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67","Type":"ContainerStarted","Data":"0fb3f200d30b6d6c3cbcc1fd09274bd1a2462686041d07008c77385970149b91"} Mar 08 19:45:26 crc kubenswrapper[4885]: I0308 19:45:26.514739 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerID="9786fb7b6a4f4d2ecb4e516099038ab04c52aa4d7b996237f7371a71c20dedb8" exitCode=0 Mar 08 19:45:26 crc kubenswrapper[4885]: I0308 19:45:26.514909 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" event={"ID":"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67","Type":"ContainerDied","Data":"9786fb7b6a4f4d2ecb4e516099038ab04c52aa4d7b996237f7371a71c20dedb8"} Mar 08 19:45:27 crc kubenswrapper[4885]: I0308 19:45:27.527076 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerID="699eeb7926e7079f8572a448098dff2cc1160bab7c53f37413cb5b08942c6c7e" exitCode=0 Mar 08 19:45:27 crc kubenswrapper[4885]: I0308 19:45:27.527138 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" event={"ID":"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67","Type":"ContainerDied","Data":"699eeb7926e7079f8572a448098dff2cc1160bab7c53f37413cb5b08942c6c7e"} Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.826668 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.900059 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util\") pod \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.900156 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle\") pod \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.900208 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhgwp\" (UniqueName: 
\"kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp\") pod \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.901587 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle" (OuterVolumeSpecName: "bundle") pod "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" (UID: "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.905692 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp" (OuterVolumeSpecName: "kube-api-access-xhgwp") pod "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" (UID: "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67"). InnerVolumeSpecName "kube-api-access-xhgwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.912779 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util" (OuterVolumeSpecName: "util") pod "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" (UID: "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.001831 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.001869 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.001883 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhgwp\" (UniqueName: \"kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.544904 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" event={"ID":"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67","Type":"ContainerDied","Data":"0fb3f200d30b6d6c3cbcc1fd09274bd1a2462686041d07008c77385970149b91"} Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.544984 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fb3f200d30b6d6c3cbcc1fd09274bd1a2462686041d07008c77385970149b91" Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.545062 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:30 crc kubenswrapper[4885]: I0308 19:45:30.443141 4885 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.612695 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk"] Mar 08 19:45:32 crc kubenswrapper[4885]: E0308 19:45:32.613017 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="util" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.613035 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="util" Mar 08 19:45:32 crc kubenswrapper[4885]: E0308 19:45:32.613052 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="pull" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.613062 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="pull" Mar 08 19:45:32 crc kubenswrapper[4885]: E0308 19:45:32.613085 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="extract" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.613096 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="extract" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.613260 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="extract" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.613805 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.615857 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.616147 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rn8k8" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.616584 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.630695 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk"] Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.776573 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkttv\" (UniqueName: \"kubernetes.io/projected/02d2b43e-55f4-49f1-9bb1-3e70ed22a3da-kube-api-access-dkttv\") pod \"nmstate-operator-75c5dccd6c-5twjk\" (UID: \"02d2b43e-55f4-49f1-9bb1-3e70ed22a3da\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.783503 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.878128 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkttv\" (UniqueName: \"kubernetes.io/projected/02d2b43e-55f4-49f1-9bb1-3e70ed22a3da-kube-api-access-dkttv\") pod \"nmstate-operator-75c5dccd6c-5twjk\" (UID: \"02d2b43e-55f4-49f1-9bb1-3e70ed22a3da\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.901290 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dkttv\" (UniqueName: \"kubernetes.io/projected/02d2b43e-55f4-49f1-9bb1-3e70ed22a3da-kube-api-access-dkttv\") pod \"nmstate-operator-75c5dccd6c-5twjk\" (UID: \"02d2b43e-55f4-49f1-9bb1-3e70ed22a3da\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.930733 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" Mar 08 19:45:33 crc kubenswrapper[4885]: I0308 19:45:33.166512 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk"] Mar 08 19:45:33 crc kubenswrapper[4885]: W0308 19:45:33.173145 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02d2b43e_55f4_49f1_9bb1_3e70ed22a3da.slice/crio-4a40e8045dd740aa28bc93465b1c858c9e89054bcbc83189308dc2ae0a103d67 WatchSource:0}: Error finding container 4a40e8045dd740aa28bc93465b1c858c9e89054bcbc83189308dc2ae0a103d67: Status 404 returned error can't find the container with id 4a40e8045dd740aa28bc93465b1c858c9e89054bcbc83189308dc2ae0a103d67 Mar 08 19:45:33 crc kubenswrapper[4885]: I0308 19:45:33.567645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" event={"ID":"02d2b43e-55f4-49f1-9bb1-3e70ed22a3da","Type":"ContainerStarted","Data":"4a40e8045dd740aa28bc93465b1c858c9e89054bcbc83189308dc2ae0a103d67"} Mar 08 19:45:36 crc kubenswrapper[4885]: I0308 19:45:36.589901 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" event={"ID":"02d2b43e-55f4-49f1-9bb1-3e70ed22a3da","Type":"ContainerStarted","Data":"ad29a575fad32bb21e1014d36c3e961dfaf8b0fc12bb6247f70e49cb2f02cd9f"} Mar 08 19:45:36 crc kubenswrapper[4885]: I0308 19:45:36.621778 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" podStartSLOduration=2.282291635 podStartE2EDuration="4.621753366s" podCreationTimestamp="2026-03-08 19:45:32 +0000 UTC" firstStartedPulling="2026-03-08 19:45:33.175623767 +0000 UTC m=+834.571677800" lastFinishedPulling="2026-03-08 19:45:35.515085498 +0000 UTC m=+836.911139531" observedRunningTime="2026-03-08 19:45:36.621716955 +0000 UTC m=+838.017771008" watchObservedRunningTime="2026-03-08 19:45:36.621753366 +0000 UTC m=+838.017807419" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.345021 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-wsk7q"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.346905 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.357818 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-8pvzf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.378248 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.379050 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.383125 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.383379 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6m2b5"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.384388 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.392165 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-wsk7q"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.402603 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.486265 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487069 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487449 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tbh9\" (UniqueName: \"kubernetes.io/projected/75f588d1-7159-4a94-bf89-bb18a880a403-kube-api-access-7tbh9\") pod \"nmstate-metrics-69594cc75-wsk7q\" (UID: \"75f588d1-7159-4a94-bf89-bb18a880a403\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487513 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-ovs-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487539 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3793d26a-a132-40db-b8fe-2cf83428b03c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c59z8\" (UniqueName: \"kubernetes.io/projected/74d96fe5-1ab9-4703-8717-509cf115d985-kube-api-access-c59z8\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487655 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9c7q\" (UniqueName: \"kubernetes.io/projected/3793d26a-a132-40db-b8fe-2cf83428b03c-kube-api-access-v9c7q\") pod \"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487674 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-dbus-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487699 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-nmstate-lock\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.489096 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-bcjwg" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.489308 4885 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-nmstate"/"plugin-serving-cert" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.489514 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.496043 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588500 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c59z8\" (UniqueName: \"kubernetes.io/projected/74d96fe5-1ab9-4703-8717-509cf115d985-kube-api-access-c59z8\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588575 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-dbus-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588599 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9c7q\" (UniqueName: \"kubernetes.io/projected/3793d26a-a132-40db-b8fe-2cf83428b03c-kube-api-access-v9c7q\") pod \"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-nmstate-lock\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc 
kubenswrapper[4885]: I0308 19:45:41.588653 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tbh9\" (UniqueName: \"kubernetes.io/projected/75f588d1-7159-4a94-bf89-bb18a880a403-kube-api-access-7tbh9\") pod \"nmstate-metrics-69594cc75-wsk7q\" (UID: \"75f588d1-7159-4a94-bf89-bb18a880a403\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588694 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-ovs-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3793d26a-a132-40db-b8fe-2cf83428b03c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588744 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c548cbba-61a5-4167-b494-f57c45b1599b-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-677jt\" (UniqueName: \"kubernetes.io/projected/c548cbba-61a5-4167-b494-f57c45b1599b-kube-api-access-677jt\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588792 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c548cbba-61a5-4167-b494-f57c45b1599b-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.589249 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-nmstate-lock\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.589293 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-ovs-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.589349 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-dbus-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.602568 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3793d26a-a132-40db-b8fe-2cf83428b03c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.610757 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c59z8\" (UniqueName: \"kubernetes.io/projected/74d96fe5-1ab9-4703-8717-509cf115d985-kube-api-access-c59z8\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.624787 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9c7q\" (UniqueName: \"kubernetes.io/projected/3793d26a-a132-40db-b8fe-2cf83428b03c-kube-api-access-v9c7q\") pod \"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.637912 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tbh9\" (UniqueName: \"kubernetes.io/projected/75f588d1-7159-4a94-bf89-bb18a880a403-kube-api-access-7tbh9\") pod \"nmstate-metrics-69594cc75-wsk7q\" (UID: \"75f588d1-7159-4a94-bf89-bb18a880a403\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.670605 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b996dd7c9-w96bf"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.671226 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.671408 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.690528 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c548cbba-61a5-4167-b494-f57c45b1599b-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.690875 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-677jt\" (UniqueName: \"kubernetes.io/projected/c548cbba-61a5-4167-b494-f57c45b1599b-kube-api-access-677jt\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.690903 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c548cbba-61a5-4167-b494-f57c45b1599b-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.691337 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c548cbba-61a5-4167-b494-f57c45b1599b-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.695894 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b996dd7c9-w96bf"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.698699 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c548cbba-61a5-4167-b494-f57c45b1599b-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.700371 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.710105 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.717914 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-677jt\" (UniqueName: \"kubernetes.io/projected/c548cbba-61a5-4167-b494-f57c45b1599b-kube-api-access-677jt\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792344 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-service-ca\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc 
kubenswrapper[4885]: I0308 19:45:41.792414 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-trusted-ca-bundle\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792431 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792467 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-oauth-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792488 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-oauth-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bntgs\" (UniqueName: \"kubernetes.io/projected/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-kube-api-access-bntgs\") pod \"console-7b996dd7c9-w96bf\" (UID: 
\"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.800442 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.854278 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-wsk7q"] Mar 08 19:45:41 crc kubenswrapper[4885]: W0308 19:45:41.862466 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75f588d1_7159_4a94_bf89_bb18a880a403.slice/crio-e934b9f328e728caaf3cbe244241604c612c27f7f38001bbb32ce7d9f9447038 WatchSource:0}: Error finding container e934b9f328e728caaf3cbe244241604c612c27f7f38001bbb32ce7d9f9447038: Status 404 returned error can't find the container with id e934b9f328e728caaf3cbe244241604c612c27f7f38001bbb32ce7d9f9447038 Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893676 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-oauth-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893725 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-oauth-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893754 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bntgs\" (UniqueName: 
\"kubernetes.io/projected/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-kube-api-access-bntgs\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893788 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-service-ca\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893812 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-trusted-ca-bundle\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893863 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.894540 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-oauth-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.894608 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.895450 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-trusted-ca-bundle\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.895624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-service-ca\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.897662 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.906681 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-oauth-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.908223 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bntgs\" (UniqueName: \"kubernetes.io/projected/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-kube-api-access-bntgs\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.976441 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc"] Mar 08 19:45:41 crc kubenswrapper[4885]: W0308 19:45:41.981066 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc548cbba_61a5_4167_b494_f57c45b1599b.slice/crio-c6c36de055931b5f8166200aca3cea3396709baaeaddcb6d0af48e870ea787f7 WatchSource:0}: Error finding container c6c36de055931b5f8166200aca3cea3396709baaeaddcb6d0af48e870ea787f7: Status 404 returned error can't find the container with id c6c36de055931b5f8166200aca3cea3396709baaeaddcb6d0af48e870ea787f7 Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.028637 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.160443 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9"] Mar 08 19:45:42 crc kubenswrapper[4885]: W0308 19:45:42.181033 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3793d26a_a132_40db_b8fe_2cf83428b03c.slice/crio-54efbc3d54a25476adda0f722699900849c837d23fbad25e75358284e72317b8 WatchSource:0}: Error finding container 54efbc3d54a25476adda0f722699900849c837d23fbad25e75358284e72317b8: Status 404 returned error can't find the container with id 54efbc3d54a25476adda0f722699900849c837d23fbad25e75358284e72317b8 Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.277690 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b996dd7c9-w96bf"] Mar 08 19:45:42 crc kubenswrapper[4885]: W0308 19:45:42.282466 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddee9a38c_b44c_44d6_ad0f_8c19401cc08d.slice/crio-a12f57b10208243806d9622c35319ed70cb4a69ce748c334d6cded98091d8fea WatchSource:0}: Error finding container a12f57b10208243806d9622c35319ed70cb4a69ce748c334d6cded98091d8fea: Status 404 returned error can't find the container with id a12f57b10208243806d9622c35319ed70cb4a69ce748c334d6cded98091d8fea Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.629710 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6m2b5" event={"ID":"74d96fe5-1ab9-4703-8717-509cf115d985","Type":"ContainerStarted","Data":"b5bc75ef30c06b536e813b54e231d1bbf2490f82f7b221eceef19ac55f58b961"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.631815 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" 
event={"ID":"c548cbba-61a5-4167-b494-f57c45b1599b","Type":"ContainerStarted","Data":"c6c36de055931b5f8166200aca3cea3396709baaeaddcb6d0af48e870ea787f7"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.632827 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" event={"ID":"3793d26a-a132-40db-b8fe-2cf83428b03c","Type":"ContainerStarted","Data":"54efbc3d54a25476adda0f722699900849c837d23fbad25e75358284e72317b8"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.634650 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b996dd7c9-w96bf" event={"ID":"dee9a38c-b44c-44d6-ad0f-8c19401cc08d","Type":"ContainerStarted","Data":"1bb2b1328439a08fc069609b03ebe18cdfa94ae68311eaba79b0162f2876367f"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.634681 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b996dd7c9-w96bf" event={"ID":"dee9a38c-b44c-44d6-ad0f-8c19401cc08d","Type":"ContainerStarted","Data":"a12f57b10208243806d9622c35319ed70cb4a69ce748c334d6cded98091d8fea"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.637202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" event={"ID":"75f588d1-7159-4a94-bf89-bb18a880a403","Type":"ContainerStarted","Data":"e934b9f328e728caaf3cbe244241604c612c27f7f38001bbb32ce7d9f9447038"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.663752 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b996dd7c9-w96bf" podStartSLOduration=1.663730787 podStartE2EDuration="1.663730787s" podCreationTimestamp="2026-03-08 19:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:45:42.66272401 +0000 UTC m=+844.058778103" watchObservedRunningTime="2026-03-08 19:45:42.663730787 +0000 UTC 
m=+844.059784820" Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.665685 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" event={"ID":"3793d26a-a132-40db-b8fe-2cf83428b03c","Type":"ContainerStarted","Data":"430cc1d3eb031f5b47880645cef52539e7fddbb6ce93d01366ee79bb4f939f48"} Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.666438 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.668620 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" event={"ID":"75f588d1-7159-4a94-bf89-bb18a880a403","Type":"ContainerStarted","Data":"2bbaec23d9383c386a763fccf911aa294d786ba762625b7c6fc60a23f38cd74f"} Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.671012 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6m2b5" event={"ID":"74d96fe5-1ab9-4703-8717-509cf115d985","Type":"ContainerStarted","Data":"367aad91784a6b28859a3e295f9c5073a2183ead48e88337ffd1746429885746"} Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.671136 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.674822 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" event={"ID":"c548cbba-61a5-4167-b494-f57c45b1599b","Type":"ContainerStarted","Data":"9f94088f6bd2cd278c10c399b0638b3864e723d5c90146f00f4c6887bed59e25"} Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.796470 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" podStartSLOduration=1.75861596 podStartE2EDuration="4.796451609s" podCreationTimestamp="2026-03-08 19:45:41 +0000 UTC" 
firstStartedPulling="2026-03-08 19:45:42.182959448 +0000 UTC m=+843.579013481" lastFinishedPulling="2026-03-08 19:45:45.220795107 +0000 UTC m=+846.616849130" observedRunningTime="2026-03-08 19:45:45.713095441 +0000 UTC m=+847.109149514" watchObservedRunningTime="2026-03-08 19:45:45.796451609 +0000 UTC m=+847.192505642" Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.797030 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" podStartSLOduration=1.565931415 podStartE2EDuration="4.797019154s" podCreationTimestamp="2026-03-08 19:45:41 +0000 UTC" firstStartedPulling="2026-03-08 19:45:41.983006949 +0000 UTC m=+843.379060972" lastFinishedPulling="2026-03-08 19:45:45.214094678 +0000 UTC m=+846.610148711" observedRunningTime="2026-03-08 19:45:45.79535223 +0000 UTC m=+847.191406263" watchObservedRunningTime="2026-03-08 19:45:45.797019154 +0000 UTC m=+847.193073187" Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.827443 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6m2b5" podStartSLOduration=1.368245467 podStartE2EDuration="4.827419783s" podCreationTimestamp="2026-03-08 19:45:41 +0000 UTC" firstStartedPulling="2026-03-08 19:45:41.760682065 +0000 UTC m=+843.156736088" lastFinishedPulling="2026-03-08 19:45:45.219856391 +0000 UTC m=+846.615910404" observedRunningTime="2026-03-08 19:45:45.823348324 +0000 UTC m=+847.219402357" watchObservedRunningTime="2026-03-08 19:45:45.827419783 +0000 UTC m=+847.223473826" Mar 08 19:45:48 crc kubenswrapper[4885]: I0308 19:45:48.697568 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" event={"ID":"75f588d1-7159-4a94-bf89-bb18a880a403","Type":"ContainerStarted","Data":"cdf0d00bccdf86ac023c9a8807db048987db440d45a1dd3d7958509fee8a5ebe"} Mar 08 19:45:48 crc kubenswrapper[4885]: I0308 19:45:48.724377 4885 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" podStartSLOduration=1.171126853 podStartE2EDuration="7.724336902s" podCreationTimestamp="2026-03-08 19:45:41 +0000 UTC" firstStartedPulling="2026-03-08 19:45:41.866075039 +0000 UTC m=+843.262129062" lastFinishedPulling="2026-03-08 19:45:48.419285088 +0000 UTC m=+849.815339111" observedRunningTime="2026-03-08 19:45:48.719537255 +0000 UTC m=+850.115591358" watchObservedRunningTime="2026-03-08 19:45:48.724336902 +0000 UTC m=+850.120390965" Mar 08 19:45:51 crc kubenswrapper[4885]: I0308 19:45:51.752975 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:52 crc kubenswrapper[4885]: I0308 19:45:52.028999 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:52 crc kubenswrapper[4885]: I0308 19:45:52.029076 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:52 crc kubenswrapper[4885]: I0308 19:45:52.036598 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:52 crc kubenswrapper[4885]: I0308 19:45:52.737760 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:52 crc kubenswrapper[4885]: I0308 19:45:52.838559 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hsdmw"] Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.142038 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549986-mpsgk"] Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.143965 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.148619 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.151754 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.152157 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.153155 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549986-mpsgk"] Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.158078 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f7zr\" (UniqueName: \"kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr\") pod \"auto-csr-approver-29549986-mpsgk\" (UID: \"fa96ded3-40b5-4e54-9f54-72f64edfb672\") " pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.260008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f7zr\" (UniqueName: \"kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr\") pod \"auto-csr-approver-29549986-mpsgk\" (UID: \"fa96ded3-40b5-4e54-9f54-72f64edfb672\") " pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.295642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f7zr\" (UniqueName: \"kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr\") pod \"auto-csr-approver-29549986-mpsgk\" (UID: \"fa96ded3-40b5-4e54-9f54-72f64edfb672\") " 
pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.475283 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.698255 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549986-mpsgk"] Mar 08 19:46:00 crc kubenswrapper[4885]: W0308 19:46:00.705262 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa96ded3_40b5_4e54_9f54_72f64edfb672.slice/crio-a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f WatchSource:0}: Error finding container a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f: Status 404 returned error can't find the container with id a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.785168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" event={"ID":"fa96ded3-40b5-4e54-9f54-72f64edfb672","Type":"ContainerStarted","Data":"a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f"} Mar 08 19:46:01 crc kubenswrapper[4885]: I0308 19:46:01.710759 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:46:02 crc kubenswrapper[4885]: I0308 19:46:02.801615 4885 generic.go:334] "Generic (PLEG): container finished" podID="fa96ded3-40b5-4e54-9f54-72f64edfb672" containerID="dd28461ac62623fc6ad7ac5f483ad81428e5d2b1b26c821a328a6729559f6fbb" exitCode=0 Mar 08 19:46:02 crc kubenswrapper[4885]: I0308 19:46:02.801779 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" 
event={"ID":"fa96ded3-40b5-4e54-9f54-72f64edfb672","Type":"ContainerDied","Data":"dd28461ac62623fc6ad7ac5f483ad81428e5d2b1b26c821a328a6729559f6fbb"} Mar 08 19:46:02 crc kubenswrapper[4885]: I0308 19:46:02.819855 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:46:02 crc kubenswrapper[4885]: I0308 19:46:02.820044 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.148677 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.335399 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f7zr\" (UniqueName: \"kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr\") pod \"fa96ded3-40b5-4e54-9f54-72f64edfb672\" (UID: \"fa96ded3-40b5-4e54-9f54-72f64edfb672\") " Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.345118 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr" (OuterVolumeSpecName: "kube-api-access-4f7zr") pod "fa96ded3-40b5-4e54-9f54-72f64edfb672" (UID: "fa96ded3-40b5-4e54-9f54-72f64edfb672"). InnerVolumeSpecName "kube-api-access-4f7zr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.437330 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f7zr\" (UniqueName: \"kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.821054 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" event={"ID":"fa96ded3-40b5-4e54-9f54-72f64edfb672","Type":"ContainerDied","Data":"a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f"} Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.821115 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f" Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.821273 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:05 crc kubenswrapper[4885]: I0308 19:46:05.209484 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549980-lx7sw"] Mar 08 19:46:05 crc kubenswrapper[4885]: I0308 19:46:05.224905 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549980-lx7sw"] Mar 08 19:46:05 crc kubenswrapper[4885]: I0308 19:46:05.385768 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc137d5-821a-406d-8db5-d396d0091991" path="/var/lib/kubelet/pods/ffc137d5-821a-406d-8db5-d396d0091991/volumes" Mar 08 19:46:17 crc kubenswrapper[4885]: I0308 19:46:17.890584 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hsdmw" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" 
containerID="cri-o://72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2" gracePeriod=15 Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.239162 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr"] Mar 08 19:46:19 crc kubenswrapper[4885]: E0308 19:46:19.239580 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa96ded3-40b5-4e54-9f54-72f64edfb672" containerName="oc" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.239610 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa96ded3-40b5-4e54-9f54-72f64edfb672" containerName="oc" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.239870 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa96ded3-40b5-4e54-9f54-72f64edfb672" containerName="oc" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.241438 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.252272 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.277095 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67ntk\" (UniqueName: \"kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.277374 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.277421 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.281501 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr"] Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.380699 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.380848 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67ntk\" (UniqueName: \"kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.380907 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.381523 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.385121 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.411323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67ntk\" (UniqueName: \"kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.516047 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hsdmw_bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9/console/0.log" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 
19:46:19.516110 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.583421 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.584193 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config" (OuterVolumeSpecName: "console-config") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.584159 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.585781 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc92s\" (UniqueName: \"kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.585818 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: 
\"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.585860 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.586365 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.586644 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.586712 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.587130 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.587850 4885 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.587873 4885 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.587889 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.587968 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca" (OuterVolumeSpecName: "service-ca") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.590601 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.591109 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.592348 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s" (OuterVolumeSpecName: "kube-api-access-hc92s") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "kube-api-access-hc92s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.640088 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.691263 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc92s\" (UniqueName: \"kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.691320 4885 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.691339 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.691357 4885 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.881746 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr"] Mar 08 19:46:19 crc kubenswrapper[4885]: W0308 19:46:19.890259 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b43d8cc_1dca_4c13_a0b7_df1371935186.slice/crio-443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3 WatchSource:0}: Error finding container 443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3: Status 404 returned error can't find the container with id 443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3 Mar 08 19:46:19 crc 
kubenswrapper[4885]: I0308 19:46:19.942369 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hsdmw_bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9/console/0.log" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.942500 4885 generic.go:334] "Generic (PLEG): container finished" podID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerID="72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2" exitCode=2 Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.942734 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hsdmw" event={"ID":"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9","Type":"ContainerDied","Data":"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2"} Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.942825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hsdmw" event={"ID":"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9","Type":"ContainerDied","Data":"3f15d4fa347fa026085bfce2e909833cf87859b7db626e5ef40b0541cc513c59"} Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.942855 4885 scope.go:117] "RemoveContainer" containerID="72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.943042 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.948852 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerStarted","Data":"443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3"} Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.988787 4885 scope.go:117] "RemoveContainer" containerID="72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2" Mar 08 19:46:19 crc kubenswrapper[4885]: E0308 19:46:19.989770 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2\": container with ID starting with 72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2 not found: ID does not exist" containerID="72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.989852 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2"} err="failed to get container status \"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2\": rpc error: code = NotFound desc = could not find container \"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2\": container with ID starting with 72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2 not found: ID does not exist" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.994852 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hsdmw"] Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.999076 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-f9d7485db-hsdmw"] Mar 08 19:46:20 crc kubenswrapper[4885]: I0308 19:46:20.960588 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerID="c9063d91d4348c09456bc8e3a036e05fb6eaf9aa2d95dc039e13087e50453059" exitCode=0 Mar 08 19:46:20 crc kubenswrapper[4885]: I0308 19:46:20.960675 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerDied","Data":"c9063d91d4348c09456bc8e3a036e05fb6eaf9aa2d95dc039e13087e50453059"} Mar 08 19:46:21 crc kubenswrapper[4885]: I0308 19:46:21.380452 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" path="/var/lib/kubelet/pods/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9/volumes" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.772832 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"] Mar 08 19:46:22 crc kubenswrapper[4885]: E0308 19:46:22.775029 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.775054 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.775228 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.777884 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.782578 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"] Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.834846 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.834883 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njk5s\" (UniqueName: \"kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.834948 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.936260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.936493 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-njk5s\" (UniqueName: \"kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.936593 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.937082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.937410 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.956074 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njk5s\" (UniqueName: \"kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.974732 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerStarted","Data":"fb4dea5d511facd255858e1d6530e40b89604a0811ac12d96a14881c881b100c"} Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.173485 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.373806 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"] Mar 08 19:46:23 crc kubenswrapper[4885]: W0308 19:46:23.378485 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4ddb731_1c00_4dea_80e3_9c2d372916e9.slice/crio-09ef166d444cd5a025be9da10424813ad4749985b0415d1dc750e70588de3801 WatchSource:0}: Error finding container 09ef166d444cd5a025be9da10424813ad4749985b0415d1dc750e70588de3801: Status 404 returned error can't find the container with id 09ef166d444cd5a025be9da10424813ad4749985b0415d1dc750e70588de3801 Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.982051 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerID="04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2" exitCode=0 Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.982372 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerDied","Data":"04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2"} Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.982723 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" 
event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerStarted","Data":"09ef166d444cd5a025be9da10424813ad4749985b0415d1dc750e70588de3801"} Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.986265 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerID="fb4dea5d511facd255858e1d6530e40b89604a0811ac12d96a14881c881b100c" exitCode=0 Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.986321 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerDied","Data":"fb4dea5d511facd255858e1d6530e40b89604a0811ac12d96a14881c881b100c"} Mar 08 19:46:25 crc kubenswrapper[4885]: I0308 19:46:24.999453 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerID="b0d3a913135940ae2cfeb85521a6d631a45734d9847610f1c13fa9dfe3a8cf22" exitCode=0 Mar 08 19:46:25 crc kubenswrapper[4885]: I0308 19:46:24.999585 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerDied","Data":"b0d3a913135940ae2cfeb85521a6d631a45734d9847610f1c13fa9dfe3a8cf22"} Mar 08 19:46:25 crc kubenswrapper[4885]: I0308 19:46:25.002523 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerStarted","Data":"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465"} Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.013334 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerID="36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465" exitCode=0 Mar 08 19:46:26 crc kubenswrapper[4885]: 
I0308 19:46:26.013462 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerDied","Data":"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465"} Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.346816 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.385149 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67ntk\" (UniqueName: \"kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk\") pod \"4b43d8cc-1dca-4c13-a0b7-df1371935186\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.385272 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util\") pod \"4b43d8cc-1dca-4c13-a0b7-df1371935186\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.385363 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle\") pod \"4b43d8cc-1dca-4c13-a0b7-df1371935186\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.387492 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle" (OuterVolumeSpecName: "bundle") pod "4b43d8cc-1dca-4c13-a0b7-df1371935186" (UID: "4b43d8cc-1dca-4c13-a0b7-df1371935186"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.403033 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util" (OuterVolumeSpecName: "util") pod "4b43d8cc-1dca-4c13-a0b7-df1371935186" (UID: "4b43d8cc-1dca-4c13-a0b7-df1371935186"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.404610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk" (OuterVolumeSpecName: "kube-api-access-67ntk") pod "4b43d8cc-1dca-4c13-a0b7-df1371935186" (UID: "4b43d8cc-1dca-4c13-a0b7-df1371935186"). InnerVolumeSpecName "kube-api-access-67ntk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.487675 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67ntk\" (UniqueName: \"kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.487721 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.487741 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:27 crc kubenswrapper[4885]: I0308 19:46:27.024692 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" 
event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerStarted","Data":"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2"} Mar 08 19:46:27 crc kubenswrapper[4885]: I0308 19:46:27.027092 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerDied","Data":"443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3"} Mar 08 19:46:27 crc kubenswrapper[4885]: I0308 19:46:27.027111 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3" Mar 08 19:46:27 crc kubenswrapper[4885]: I0308 19:46:27.027175 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:27 crc kubenswrapper[4885]: I0308 19:46:27.057250 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p2l28" podStartSLOduration=2.573256943 podStartE2EDuration="5.057227828s" podCreationTimestamp="2026-03-08 19:46:22 +0000 UTC" firstStartedPulling="2026-03-08 19:46:23.984729007 +0000 UTC m=+885.380783030" lastFinishedPulling="2026-03-08 19:46:26.468699882 +0000 UTC m=+887.864753915" observedRunningTime="2026-03-08 19:46:27.04640483 +0000 UTC m=+888.442458863" watchObservedRunningTime="2026-03-08 19:46:27.057227828 +0000 UTC m=+888.453281841" Mar 08 19:46:32 crc kubenswrapper[4885]: I0308 19:46:32.819193 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:46:32 crc kubenswrapper[4885]: I0308 
19:46:32.819660 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:46:33 crc kubenswrapper[4885]: I0308 19:46:33.174361 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:33 crc kubenswrapper[4885]: I0308 19:46:33.174432 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:34 crc kubenswrapper[4885]: I0308 19:46:34.236737 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p2l28" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="registry-server" probeResult="failure" output=< Mar 08 19:46:34 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 19:46:34 crc kubenswrapper[4885]: > Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.619664 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9"] Mar 08 19:46:38 crc kubenswrapper[4885]: E0308 19:46:38.620171 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="extract" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.620186 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="extract" Mar 08 19:46:38 crc kubenswrapper[4885]: E0308 19:46:38.620204 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="pull" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.620212 4885 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="pull" Mar 08 19:46:38 crc kubenswrapper[4885]: E0308 19:46:38.620238 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="util" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.620247 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="util" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.620368 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="extract" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.620790 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.622732 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.623986 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.624000 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-p2cn5" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.623988 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.629068 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.630015 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9"] Mar 08 19:46:38 crc 
kubenswrapper[4885]: I0308 19:46:38.655407 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-webhook-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.655456 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-apiservice-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.655577 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk7h6\" (UniqueName: \"kubernetes.io/projected/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-kube-api-access-fk7h6\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.757006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk7h6\" (UniqueName: \"kubernetes.io/projected/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-kube-api-access-fk7h6\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.757135 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-webhook-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.757178 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-apiservice-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.763014 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-apiservice-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.766807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-webhook-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.771662 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk7h6\" (UniqueName: \"kubernetes.io/projected/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-kube-api-access-fk7h6\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " 
pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.853087 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j"] Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.853914 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.856110 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-nnjzn" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.856328 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.858328 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.870541 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j"] Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.938940 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.958914 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdx9v\" (UniqueName: \"kubernetes.io/projected/6ea4545b-278f-43ff-be3c-fc1346b591a1-kube-api-access-vdx9v\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.959290 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-webhook-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.959312 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-apiservice-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.061313 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdx9v\" (UniqueName: \"kubernetes.io/projected/6ea4545b-278f-43ff-be3c-fc1346b591a1-kube-api-access-vdx9v\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.061464 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-webhook-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.061506 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-apiservice-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.066759 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-apiservice-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.067036 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-webhook-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.092172 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdx9v\" (UniqueName: \"kubernetes.io/projected/6ea4545b-278f-43ff-be3c-fc1346b591a1-kube-api-access-vdx9v\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: 
\"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j"
Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.167793 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j"
Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.345841 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9"]
Mar 08 19:46:39 crc kubenswrapper[4885]: W0308 19:46:39.356642 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca51bb10_b38d_4e58_9d29_6c6b8922f72e.slice/crio-4274b810292292ea868563154b236b232716fc4104ad17516f38ceb78fa29e9c WatchSource:0}: Error finding container 4274b810292292ea868563154b236b232716fc4104ad17516f38ceb78fa29e9c: Status 404 returned error can't find the container with id 4274b810292292ea868563154b236b232716fc4104ad17516f38ceb78fa29e9c
Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.386277 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j"]
Mar 08 19:46:40 crc kubenswrapper[4885]: I0308 19:46:40.106358 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" event={"ID":"6ea4545b-278f-43ff-be3c-fc1346b591a1","Type":"ContainerStarted","Data":"5a5644454e0cb65ab10a1601abfaf42ac487a0421086e15d6f95505a84a8b741"}
Mar 08 19:46:40 crc kubenswrapper[4885]: I0308 19:46:40.107980 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" event={"ID":"ca51bb10-b38d-4e58-9d29-6c6b8922f72e","Type":"ContainerStarted","Data":"4274b810292292ea868563154b236b232716fc4104ad17516f38ceb78fa29e9c"}
Mar 08 19:46:43 crc kubenswrapper[4885]: I0308 19:46:43.246498 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p2l28"
Mar 08 19:46:43 crc kubenswrapper[4885]: I0308 19:46:43.302337 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p2l28"
Mar 08 19:46:43 crc kubenswrapper[4885]: I0308 19:46:43.957316 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"]
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.139634 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" event={"ID":"6ea4545b-278f-43ff-be3c-fc1346b591a1","Type":"ContainerStarted","Data":"fa61c5941d266f0ede484cbeb0b888dd23222a5f02b756428a17dfbec0c8dadd"}
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.140128 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j"
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.145322 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" event={"ID":"ca51bb10-b38d-4e58-9d29-6c6b8922f72e","Type":"ContainerStarted","Data":"c5a21ccc6282e89f02c992eb21583597058be56faa57567ba98547a14429420c"}
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.145520 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9"
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.145535 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p2l28" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="registry-server" containerID="cri-o://9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2" gracePeriod=2
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.167898 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" podStartSLOduration=2.376800235 podStartE2EDuration="7.16786636s" podCreationTimestamp="2026-03-08 19:46:38 +0000 UTC" firstStartedPulling="2026-03-08 19:46:39.395394959 +0000 UTC m=+900.791448982" lastFinishedPulling="2026-03-08 19:46:44.186461084 +0000 UTC m=+905.582515107" observedRunningTime="2026-03-08 19:46:45.163243746 +0000 UTC m=+906.559297799" watchObservedRunningTime="2026-03-08 19:46:45.16786636 +0000 UTC m=+906.563920403"
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.195556 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" podStartSLOduration=2.392141653 podStartE2EDuration="7.195537685s" podCreationTimestamp="2026-03-08 19:46:38 +0000 UTC" firstStartedPulling="2026-03-08 19:46:39.360220513 +0000 UTC m=+900.756274546" lastFinishedPulling="2026-03-08 19:46:44.163616565 +0000 UTC m=+905.559670578" observedRunningTime="2026-03-08 19:46:45.191582961 +0000 UTC m=+906.587636994" watchObservedRunningTime="2026-03-08 19:46:45.195537685 +0000 UTC m=+906.591591708"
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.603939 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p2l28"
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.661616 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content\") pod \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") "
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.661807 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njk5s\" (UniqueName: \"kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s\") pod \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") "
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.661865 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities\") pod \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") "
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.663011 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities" (OuterVolumeSpecName: "utilities") pod "c4ddb731-1c00-4dea-80e3-9c2d372916e9" (UID: "c4ddb731-1c00-4dea-80e3-9c2d372916e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.668185 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s" (OuterVolumeSpecName: "kube-api-access-njk5s") pod "c4ddb731-1c00-4dea-80e3-9c2d372916e9" (UID: "c4ddb731-1c00-4dea-80e3-9c2d372916e9"). InnerVolumeSpecName "kube-api-access-njk5s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.763911 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njk5s\" (UniqueName: \"kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s\") on node \"crc\" DevicePath \"\""
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.763966 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.801045 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4ddb731-1c00-4dea-80e3-9c2d372916e9" (UID: "c4ddb731-1c00-4dea-80e3-9c2d372916e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.819460 4885 scope.go:117] "RemoveContainer" containerID="6b3edc0ab6930c447e72d1b9e0e05c67fcbcc8c8cb108ba4b449e1f4acc1e00e"
Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.864633 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.154424 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerID="9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2" exitCode=0
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.154490 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p2l28"
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.154574 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerDied","Data":"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2"}
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.154822 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerDied","Data":"09ef166d444cd5a025be9da10424813ad4749985b0415d1dc750e70588de3801"}
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.154870 4885 scope.go:117] "RemoveContainer" containerID="9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2"
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.178045 4885 scope.go:117] "RemoveContainer" containerID="36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465"
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.193097 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"]
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.206243 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"]
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.224297 4885 scope.go:117] "RemoveContainer" containerID="04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2"
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.251167 4885 scope.go:117] "RemoveContainer" containerID="9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2"
Mar 08 19:46:46 crc kubenswrapper[4885]: E0308 19:46:46.251556 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2\": container with ID starting with 9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2 not found: ID does not exist" containerID="9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2"
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.251588 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2"} err="failed to get container status \"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2\": rpc error: code = NotFound desc = could not find container \"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2\": container with ID starting with 9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2 not found: ID does not exist"
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.251609 4885 scope.go:117] "RemoveContainer" containerID="36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465"
Mar 08 19:46:46 crc kubenswrapper[4885]: E0308 19:46:46.251812 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465\": container with ID starting with 36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465 not found: ID does not exist" containerID="36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465"
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.251833 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465"} err="failed to get container status \"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465\": rpc error: code = NotFound desc = could not find container \"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465\": container with ID starting with 36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465 not found: ID does not exist"
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.251844 4885 scope.go:117] "RemoveContainer" containerID="04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2"
Mar 08 19:46:46 crc kubenswrapper[4885]: E0308 19:46:46.252110 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2\": container with ID starting with 04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2 not found: ID does not exist" containerID="04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2"
Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.252150 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2"} err="failed to get container status \"04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2\": rpc error: code = NotFound desc = could not find container \"04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2\": container with ID starting with 04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2 not found: ID does not exist"
Mar 08 19:46:47 crc kubenswrapper[4885]: I0308 19:46:47.381302 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" path="/var/lib/kubelet/pods/c4ddb731-1c00-4dea-80e3-9c2d372916e9/volumes"
Mar 08 19:46:59 crc kubenswrapper[4885]: I0308 19:46:59.179778 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j"
Mar 08 19:47:02 crc kubenswrapper[4885]: I0308 19:47:02.818723 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 19:47:02 crc kubenswrapper[4885]: I0308 19:47:02.819169 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 19:47:02 crc kubenswrapper[4885]: I0308 19:47:02.819278 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 19:47:02 crc kubenswrapper[4885]: I0308 19:47:02.820027 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 19:47:02 crc kubenswrapper[4885]: I0308 19:47:02.820118 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2" gracePeriod=600
Mar 08 19:47:03 crc kubenswrapper[4885]: I0308 19:47:03.268470 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2" exitCode=0
Mar 08 19:47:03 crc kubenswrapper[4885]: I0308 19:47:03.268515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2"}
Mar 08 19:47:03 crc kubenswrapper[4885]: I0308 19:47:03.268758 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374"}
Mar 08 19:47:03 crc kubenswrapper[4885]: I0308 19:47:03.268778 4885 scope.go:117] "RemoveContainer" containerID="efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f"
Mar 08 19:47:18 crc kubenswrapper[4885]: I0308 19:47:18.942684 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.884148 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hq28v"]
Mar 08 19:47:19 crc kubenswrapper[4885]: E0308 19:47:19.884453 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="extract-utilities"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.884486 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="extract-utilities"
Mar 08 19:47:19 crc kubenswrapper[4885]: E0308 19:47:19.884518 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="extract-content"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.884530 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="extract-content"
Mar 08 19:47:19 crc kubenswrapper[4885]: E0308 19:47:19.884544 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="registry-server"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.884555 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="registry-server"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.884736 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="registry-server"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.887708 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.892471 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-8q54c"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.893133 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.894235 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"]
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.894840 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.896696 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.899762 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.912494 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"]
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944392 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dca42faa-df32-44b5-99e8-109120aa36a1-frr-startup\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944435 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944461 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-sockets\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944475 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np6n6\" (UniqueName: \"kubernetes.io/projected/dca42faa-df32-44b5-99e8-109120aa36a1-kube-api-access-np6n6\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt7wq\" (UniqueName: \"kubernetes.io/projected/e76b0259-0d11-4451-b770-4ca5611ce32e-kube-api-access-wt7wq\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944542 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e76b0259-0d11-4451-b770-4ca5611ce32e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944565 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-conf\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944582 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-metrics\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944605 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-reloader\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.985487 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5nclk"]
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.987384 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5nclk"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.991644 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.991894 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-gmznl"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.992062 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.992163 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.998800 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-xj2vs"]
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.004152 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-xj2vs"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.007786 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.038052 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-xj2vs"]
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/860f2bc3-9bd4-43c5-9400-67293a877c6f-metallb-excludel2\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045647 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e76b0259-0d11-4451-b770-4ca5611ce32e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045677 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grlhg\" (UniqueName: \"kubernetes.io/projected/860f2bc3-9bd4-43c5-9400-67293a877c6f-kube-api-access-grlhg\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045699 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-conf\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-metrics\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045741 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-reloader\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045779 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045802 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-cert\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045818 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dca42faa-df32-44b5-99e8-109120aa36a1-frr-startup\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045834 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9brx\" (UniqueName: \"kubernetes.io/projected/6d11a8df-ce5d-404a-b827-822101b061c8-kube-api-access-k9brx\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045871 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk"
Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.045976 4885 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.046029 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs podName:dca42faa-df32-44b5-99e8-109120aa36a1 nodeName:}" failed. No retries permitted until 2026-03-08 19:47:20.546011571 +0000 UTC m=+941.942065584 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs") pod "frr-k8s-hq28v" (UID: "dca42faa-df32-44b5-99e8-109120aa36a1") : secret "frr-k8s-certs-secret" not found
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045973 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np6n6\" (UniqueName: \"kubernetes.io/projected/dca42faa-df32-44b5-99e8-109120aa36a1-kube-api-access-np6n6\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.046177 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-conf\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.046253 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-reloader\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.046298 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-metrics\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.046759 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dca42faa-df32-44b5-99e8-109120aa36a1-frr-startup\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.047086 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-sockets\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.046906 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-sockets\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.047144 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt7wq\" (UniqueName: \"kubernetes.io/projected/e76b0259-0d11-4451-b770-4ca5611ce32e-kube-api-access-wt7wq\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.047433 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-metrics-certs\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.051635 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e76b0259-0d11-4451-b770-4ca5611ce32e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.078280 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt7wq\" (UniqueName: \"kubernetes.io/projected/e76b0259-0d11-4451-b770-4ca5611ce32e-kube-api-access-wt7wq\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.078939 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np6n6\" (UniqueName: \"kubernetes.io/projected/dca42faa-df32-44b5-99e8-109120aa36a1-kube-api-access-np6n6\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148539 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/860f2bc3-9bd4-43c5-9400-67293a877c6f-metallb-excludel2\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148587 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grlhg\" (UniqueName: \"kubernetes.io/projected/860f2bc3-9bd4-43c5-9400-67293a877c6f-kube-api-access-grlhg\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148630 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148651 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-cert\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148670 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9brx\" (UniqueName: \"kubernetes.io/projected/6d11a8df-ce5d-404a-b827-822101b061c8-kube-api-access-k9brx\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148712 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk"
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148742 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-metrics-certs\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk"
Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.148773 4885 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.148843 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs podName:6d11a8df-ce5d-404a-b827-822101b061c8 nodeName:}" failed. No retries permitted until 2026-03-08 19:47:20.648824097 +0000 UTC m=+942.044878120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs") pod "controller-86ddb6bd46-xj2vs" (UID: "6d11a8df-ce5d-404a-b827-822101b061c8") : secret "controller-certs-secret" not found
Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.149257 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/860f2bc3-9bd4-43c5-9400-67293a877c6f-metallb-excludel2\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk"
Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.149285 4885 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.149310 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist podName:860f2bc3-9bd4-43c5-9400-67293a877c6f nodeName:}" failed. No retries permitted until 2026-03-08 19:47:20.64930259 +0000 UTC m=+942.045356613 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist") pod "speaker-5nclk" (UID: "860f2bc3-9bd4-43c5-9400-67293a877c6f") : secret "metallb-memberlist" not found Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.151242 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-metrics-certs\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.151411 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.161826 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-cert\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.167348 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grlhg\" (UniqueName: \"kubernetes.io/projected/860f2bc3-9bd4-43c5-9400-67293a877c6f-kube-api-access-grlhg\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.167498 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9brx\" (UniqueName: \"kubernetes.io/projected/6d11a8df-ce5d-404a-b827-822101b061c8-kube-api-access-k9brx\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.224151 4885 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.555706 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.562175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.619727 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"] Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.627977 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.657010 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.657106 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.657311 4885 secret.go:188] Couldn't get secret 
metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.657385 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist podName:860f2bc3-9bd4-43c5-9400-67293a877c6f nodeName:}" failed. No retries permitted until 2026-03-08 19:47:21.657355413 +0000 UTC m=+943.053409446 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist") pod "speaker-5nclk" (UID: "860f2bc3-9bd4-43c5-9400-67293a877c6f") : secret "metallb-memberlist" not found Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.662664 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.814265 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.935430 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.236352 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-xj2vs"] Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.406110 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" event={"ID":"e76b0259-0d11-4451-b770-4ca5611ce32e","Type":"ContainerStarted","Data":"5e79837bf94d566e49bb58b39e4fa2d9ddddb3b9673d6d4358192998839c5ad8"} Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.408414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-xj2vs" event={"ID":"6d11a8df-ce5d-404a-b827-822101b061c8","Type":"ContainerStarted","Data":"0255e84f68df5481fa8e5065c4a980ae8f7a2ddd3eb48866a76c63671eb9b18f"} Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.410576 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"ea5c4b87c03fc4c39c57a7d50e6b1963821843539a8fca80b64cdedf268fee48"} Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.671775 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.681063 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.800283 4885 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="metallb-system/speaker-5nclk" Mar 08 19:47:21 crc kubenswrapper[4885]: W0308 19:47:21.829866 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod860f2bc3_9bd4_43c5_9400_67293a877c6f.slice/crio-261a7b8f04d3d8e551b4df8216380dadf0ea82b61f5a7a7fb76d193327540e11 WatchSource:0}: Error finding container 261a7b8f04d3d8e551b4df8216380dadf0ea82b61f5a7a7fb76d193327540e11: Status 404 returned error can't find the container with id 261a7b8f04d3d8e551b4df8216380dadf0ea82b61f5a7a7fb76d193327540e11 Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.441957 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-xj2vs" event={"ID":"6d11a8df-ce5d-404a-b827-822101b061c8","Type":"ContainerStarted","Data":"ca1dc8c263429c96ed7f0d4d929659bb9b5e74235928a34fd759f947d802a37e"} Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.442013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-xj2vs" event={"ID":"6d11a8df-ce5d-404a-b827-822101b061c8","Type":"ContainerStarted","Data":"ce97e51918924c94c787223eaf8ff957bf30e42af44b6cdd35dea62a60f75a93"} Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.442051 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.445397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5nclk" event={"ID":"860f2bc3-9bd4-43c5-9400-67293a877c6f","Type":"ContainerStarted","Data":"a8130f753791d0cc0b174b7605993009fcb0fa02b52f04ee66b350799ae716d5"} Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.445441 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5nclk" 
event={"ID":"860f2bc3-9bd4-43c5-9400-67293a877c6f","Type":"ContainerStarted","Data":"261a7b8f04d3d8e551b4df8216380dadf0ea82b61f5a7a7fb76d193327540e11"} Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.462014 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-xj2vs" podStartSLOduration=3.461996927 podStartE2EDuration="3.461996927s" podCreationTimestamp="2026-03-08 19:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:47:22.460050965 +0000 UTC m=+943.856104978" watchObservedRunningTime="2026-03-08 19:47:22.461996927 +0000 UTC m=+943.858050950" Mar 08 19:47:23 crc kubenswrapper[4885]: I0308 19:47:23.453244 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5nclk" event={"ID":"860f2bc3-9bd4-43c5-9400-67293a877c6f","Type":"ContainerStarted","Data":"58b5727749bb1effcca8bf3b9b3b818ecf8742ae681d8b8b7c533b3ffc0a8214"} Mar 08 19:47:23 crc kubenswrapper[4885]: I0308 19:47:23.473307 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5nclk" podStartSLOduration=4.473287764 podStartE2EDuration="4.473287764s" podCreationTimestamp="2026-03-08 19:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:47:23.466950315 +0000 UTC m=+944.863004338" watchObservedRunningTime="2026-03-08 19:47:23.473287764 +0000 UTC m=+944.869341777" Mar 08 19:47:24 crc kubenswrapper[4885]: I0308 19:47:24.462200 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5nclk" Mar 08 19:47:28 crc kubenswrapper[4885]: I0308 19:47:28.490671 4885 generic.go:334] "Generic (PLEG): container finished" podID="dca42faa-df32-44b5-99e8-109120aa36a1" containerID="76efcaf109695f1158fd41ebd4ca91e20931270214257e11b962dd8c87b3ae0e" 
exitCode=0 Mar 08 19:47:28 crc kubenswrapper[4885]: I0308 19:47:28.490783 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerDied","Data":"76efcaf109695f1158fd41ebd4ca91e20931270214257e11b962dd8c87b3ae0e"} Mar 08 19:47:28 crc kubenswrapper[4885]: I0308 19:47:28.494372 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" event={"ID":"e76b0259-0d11-4451-b770-4ca5611ce32e","Type":"ContainerStarted","Data":"3ac7cb82323bfaee2f9985c6c374fdc9aaae7282b3ffa4a29c875273d7e45495"} Mar 08 19:47:28 crc kubenswrapper[4885]: I0308 19:47:28.494582 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:28 crc kubenswrapper[4885]: I0308 19:47:28.564796 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" podStartSLOduration=2.487230511 podStartE2EDuration="9.564766163s" podCreationTimestamp="2026-03-08 19:47:19 +0000 UTC" firstStartedPulling="2026-03-08 19:47:20.627691583 +0000 UTC m=+942.023745616" lastFinishedPulling="2026-03-08 19:47:27.705227215 +0000 UTC m=+949.101281268" observedRunningTime="2026-03-08 19:47:28.553794441 +0000 UTC m=+949.949848494" watchObservedRunningTime="2026-03-08 19:47:28.564766163 +0000 UTC m=+949.960820226" Mar 08 19:47:29 crc kubenswrapper[4885]: I0308 19:47:29.504037 4885 generic.go:334] "Generic (PLEG): container finished" podID="dca42faa-df32-44b5-99e8-109120aa36a1" containerID="1d041f1c1898be24726f2ddc9872df824959a0536e7c1621811da9e12f86f007" exitCode=0 Mar 08 19:47:29 crc kubenswrapper[4885]: I0308 19:47:29.504135 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" 
event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerDied","Data":"1d041f1c1898be24726f2ddc9872df824959a0536e7c1621811da9e12f86f007"} Mar 08 19:47:30 crc kubenswrapper[4885]: I0308 19:47:30.514663 4885 generic.go:334] "Generic (PLEG): container finished" podID="dca42faa-df32-44b5-99e8-109120aa36a1" containerID="f0a83d4c5ffbdd02c07df0d57d046fe2cb38681ba58744360a06633b6678ddc7" exitCode=0 Mar 08 19:47:30 crc kubenswrapper[4885]: I0308 19:47:30.514772 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerDied","Data":"f0a83d4c5ffbdd02c07df0d57d046fe2cb38681ba58744360a06633b6678ddc7"} Mar 08 19:47:31 crc kubenswrapper[4885]: I0308 19:47:31.528701 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"e43431589bddbd576a18965d13e24f51aa05e878b1b4d6351b86a3b257d4fe5e"} Mar 08 19:47:31 crc kubenswrapper[4885]: I0308 19:47:31.529062 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"ba601f4d7aa0f9f436a5837c93cb62947192f2e21fb9c00ec686129ebec5c31f"} Mar 08 19:47:31 crc kubenswrapper[4885]: I0308 19:47:31.529085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"1e2d71c2217281200b1510b4f8a085bf2348e7cd278d485145f7f45c51c1ac56"} Mar 08 19:47:31 crc kubenswrapper[4885]: I0308 19:47:31.529105 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"f676e00218c4c463b862070415b45a4fd7030b76ce4f96565cc6162d8317103f"} Mar 08 19:47:31 crc kubenswrapper[4885]: I0308 19:47:31.529121 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"4a3729bde3dd2f058b64fea8a20289be12e31438a2767e84c7fe256b1dae9e05"} Mar 08 19:47:32 crc kubenswrapper[4885]: I0308 19:47:32.542957 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"dc13dca89cbe4b763e3f991f24bccdec1d3da629401113525eb14a91175de89a"} Mar 08 19:47:32 crc kubenswrapper[4885]: I0308 19:47:32.543243 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:32 crc kubenswrapper[4885]: I0308 19:47:32.572559 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hq28v" podStartSLOduration=6.878159604 podStartE2EDuration="13.572536768s" podCreationTimestamp="2026-03-08 19:47:19 +0000 UTC" firstStartedPulling="2026-03-08 19:47:20.985892767 +0000 UTC m=+942.381946830" lastFinishedPulling="2026-03-08 19:47:27.680269971 +0000 UTC m=+949.076323994" observedRunningTime="2026-03-08 19:47:32.569523947 +0000 UTC m=+953.965578010" watchObservedRunningTime="2026-03-08 19:47:32.572536768 +0000 UTC m=+953.968590821" Mar 08 19:47:35 crc kubenswrapper[4885]: I0308 19:47:35.814753 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:35 crc kubenswrapper[4885]: I0308 19:47:35.865325 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.066034 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.070545 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.076797 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.091583 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.091681 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.091790 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpnqt\" (UniqueName: \"kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.193136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.193300 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zpnqt\" (UniqueName: \"kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.193373 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.193628 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.193842 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.216521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpnqt\" (UniqueName: \"kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.402188 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.694818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:37 crc kubenswrapper[4885]: I0308 19:47:37.587598 4885 generic.go:334] "Generic (PLEG): container finished" podID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerID="18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa" exitCode=0 Mar 08 19:47:37 crc kubenswrapper[4885]: I0308 19:47:37.587677 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerDied","Data":"18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa"} Mar 08 19:47:37 crc kubenswrapper[4885]: I0308 19:47:37.587892 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerStarted","Data":"e808f25e95174e48eb13df958aaeaad16e03d9d613887ec3abec24654bd6c920"} Mar 08 19:47:38 crc kubenswrapper[4885]: I0308 19:47:38.596331 4885 generic.go:334] "Generic (PLEG): container finished" podID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerID="f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5" exitCode=0 Mar 08 19:47:38 crc kubenswrapper[4885]: I0308 19:47:38.596372 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerDied","Data":"f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5"} Mar 08 19:47:39 crc kubenswrapper[4885]: I0308 19:47:39.608478 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" 
event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerStarted","Data":"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d"} Mar 08 19:47:39 crc kubenswrapper[4885]: I0308 19:47:39.648241 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tp6kf" podStartSLOduration=2.183537445 podStartE2EDuration="3.648094097s" podCreationTimestamp="2026-03-08 19:47:36 +0000 UTC" firstStartedPulling="2026-03-08 19:47:37.589825193 +0000 UTC m=+958.985879246" lastFinishedPulling="2026-03-08 19:47:39.054381835 +0000 UTC m=+960.450435898" observedRunningTime="2026-03-08 19:47:39.643041593 +0000 UTC m=+961.039095626" watchObservedRunningTime="2026-03-08 19:47:39.648094097 +0000 UTC m=+961.044148160" Mar 08 19:47:40 crc kubenswrapper[4885]: I0308 19:47:40.230363 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:40 crc kubenswrapper[4885]: I0308 19:47:40.817221 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:40 crc kubenswrapper[4885]: I0308 19:47:40.941488 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:41 crc kubenswrapper[4885]: I0308 19:47:41.804897 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5nclk" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.479752 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w"] Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.481076 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.483534 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.502064 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcrnw\" (UniqueName: \"kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.502140 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.502199 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.507557 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w"] Mar 08 19:47:43 crc kubenswrapper[4885]: 
I0308 19:47:43.603422 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcrnw\" (UniqueName: \"kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.603593 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.603704 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.604463 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.604527 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.645454 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcrnw\" (UniqueName: \"kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.815966 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:44 crc kubenswrapper[4885]: I0308 19:47:44.146675 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w"] Mar 08 19:47:44 crc kubenswrapper[4885]: W0308 19:47:44.152693 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5bcb33a_118a_438a_86f5_467399e36ddb.slice/crio-ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115 WatchSource:0}: Error finding container ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115: Status 404 returned error can't find the container with id ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115 Mar 08 19:47:44 crc kubenswrapper[4885]: I0308 19:47:44.658185 4885 generic.go:334] "Generic (PLEG): container finished" podID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerID="ada4fe3f29671378ed74f7751b862f298d82e4bfe3218bda000ee6b7d051019e" 
exitCode=0 Mar 08 19:47:44 crc kubenswrapper[4885]: I0308 19:47:44.658240 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" event={"ID":"a5bcb33a-118a-438a-86f5-467399e36ddb","Type":"ContainerDied","Data":"ada4fe3f29671378ed74f7751b862f298d82e4bfe3218bda000ee6b7d051019e"} Mar 08 19:47:44 crc kubenswrapper[4885]: I0308 19:47:44.658273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" event={"ID":"a5bcb33a-118a-438a-86f5-467399e36ddb","Type":"ContainerStarted","Data":"ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115"} Mar 08 19:47:46 crc kubenswrapper[4885]: I0308 19:47:46.403779 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:46 crc kubenswrapper[4885]: I0308 19:47:46.404066 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:46 crc kubenswrapper[4885]: I0308 19:47:46.467777 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:46 crc kubenswrapper[4885]: I0308 19:47:46.713083 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:48 crc kubenswrapper[4885]: I0308 19:47:48.690209 4885 generic.go:334] "Generic (PLEG): container finished" podID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerID="6a17b11e7389ca9fe5329be47b679e399b863f15c11231cd63043f842d203a31" exitCode=0 Mar 08 19:47:48 crc kubenswrapper[4885]: I0308 19:47:48.690307 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" 
event={"ID":"a5bcb33a-118a-438a-86f5-467399e36ddb","Type":"ContainerDied","Data":"6a17b11e7389ca9fe5329be47b679e399b863f15c11231cd63043f842d203a31"} Mar 08 19:47:48 crc kubenswrapper[4885]: I0308 19:47:48.833590 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:48 crc kubenswrapper[4885]: I0308 19:47:48.833963 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tp6kf" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="registry-server" containerID="cri-o://793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d" gracePeriod=2 Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.359289 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.499423 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpnqt\" (UniqueName: \"kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt\") pod \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.499512 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities\") pod \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.499641 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content\") pod \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " Mar 08 19:47:49 crc kubenswrapper[4885]: 
I0308 19:47:49.501416 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities" (OuterVolumeSpecName: "utilities") pod "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" (UID: "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.505832 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt" (OuterVolumeSpecName: "kube-api-access-zpnqt") pod "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" (UID: "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0"). InnerVolumeSpecName "kube-api-access-zpnqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.547701 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" (UID: "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.601378 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpnqt\" (UniqueName: \"kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.601434 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.601452 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.702896 4885 generic.go:334] "Generic (PLEG): container finished" podID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerID="793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d" exitCode=0 Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.703058 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerDied","Data":"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d"} Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.703134 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerDied","Data":"e808f25e95174e48eb13df958aaeaad16e03d9d613887ec3abec24654bd6c920"} Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.703189 4885 scope.go:117] "RemoveContainer" containerID="793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 
19:47:49.703646 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.709091 4885 generic.go:334] "Generic (PLEG): container finished" podID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerID="34f80d156a9821d5b0ff15771041d9780c29ee9ec79de322687e5239864c0cb2" exitCode=0 Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.709156 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" event={"ID":"a5bcb33a-118a-438a-86f5-467399e36ddb","Type":"ContainerDied","Data":"34f80d156a9821d5b0ff15771041d9780c29ee9ec79de322687e5239864c0cb2"} Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.746743 4885 scope.go:117] "RemoveContainer" containerID="f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.776211 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.781766 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.798204 4885 scope.go:117] "RemoveContainer" containerID="18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.817611 4885 scope.go:117] "RemoveContainer" containerID="793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d" Mar 08 19:47:49 crc kubenswrapper[4885]: E0308 19:47:49.818222 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d\": container with ID starting with 
793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d not found: ID does not exist" containerID="793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.818273 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d"} err="failed to get container status \"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d\": rpc error: code = NotFound desc = could not find container \"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d\": container with ID starting with 793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d not found: ID does not exist" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.818307 4885 scope.go:117] "RemoveContainer" containerID="f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5" Mar 08 19:47:49 crc kubenswrapper[4885]: E0308 19:47:49.818797 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5\": container with ID starting with f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5 not found: ID does not exist" containerID="f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.819041 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5"} err="failed to get container status \"f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5\": rpc error: code = NotFound desc = could not find container \"f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5\": container with ID starting with f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5 not found: ID does not 
exist" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.819214 4885 scope.go:117] "RemoveContainer" containerID="18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa" Mar 08 19:47:49 crc kubenswrapper[4885]: E0308 19:47:49.819565 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa\": container with ID starting with 18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa not found: ID does not exist" containerID="18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.819587 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa"} err="failed to get container status \"18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa\": rpc error: code = NotFound desc = could not find container \"18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa\": container with ID starting with 18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa not found: ID does not exist" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.073513 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.230196 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle\") pod \"a5bcb33a-118a-438a-86f5-467399e36ddb\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.230266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util\") pod \"a5bcb33a-118a-438a-86f5-467399e36ddb\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.230356 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcrnw\" (UniqueName: \"kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw\") pod \"a5bcb33a-118a-438a-86f5-467399e36ddb\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.231469 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle" (OuterVolumeSpecName: "bundle") pod "a5bcb33a-118a-438a-86f5-467399e36ddb" (UID: "a5bcb33a-118a-438a-86f5-467399e36ddb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.237271 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw" (OuterVolumeSpecName: "kube-api-access-zcrnw") pod "a5bcb33a-118a-438a-86f5-467399e36ddb" (UID: "a5bcb33a-118a-438a-86f5-467399e36ddb"). InnerVolumeSpecName "kube-api-access-zcrnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.240613 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util" (OuterVolumeSpecName: "util") pod "a5bcb33a-118a-438a-86f5-467399e36ddb" (UID: "a5bcb33a-118a-438a-86f5-467399e36ddb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.332643 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcrnw\" (UniqueName: \"kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.332705 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.332725 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.381124 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" path="/var/lib/kubelet/pods/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0/volumes" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.731019 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" event={"ID":"a5bcb33a-118a-438a-86f5-467399e36ddb","Type":"ContainerDied","Data":"ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115"} Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.731094 4885 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.731131 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.490688 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2"] Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491313 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="registry-server" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491333 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="registry-server" Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491348 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="extract-content" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491356 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="extract-content" Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491363 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="extract" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491371 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="extract" Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491381 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="extract-utilities" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491391 4885 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="extract-utilities" Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491402 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="util" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491409 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="util" Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491425 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="pull" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491433 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="pull" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491560 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="registry-server" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491577 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="extract" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.492072 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.494943 4885 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-mh9vk" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.495059 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.508281 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.525948 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2"] Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.672033 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr2gb\" (UniqueName: \"kubernetes.io/projected/a393ebaa-f427-44ff-965d-c1c65ab661ba-kube-api-access-rr2gb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.672085 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a393ebaa-f427-44ff-965d-c1c65ab661ba-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.773860 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rr2gb\" (UniqueName: \"kubernetes.io/projected/a393ebaa-f427-44ff-965d-c1c65ab661ba-kube-api-access-rr2gb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.774019 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a393ebaa-f427-44ff-965d-c1c65ab661ba-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.774535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a393ebaa-f427-44ff-965d-c1c65ab661ba-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.801813 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr2gb\" (UniqueName: \"kubernetes.io/projected/a393ebaa-f427-44ff-965d-c1c65ab661ba-kube-api-access-rr2gb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.818612 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:55 crc kubenswrapper[4885]: I0308 19:47:55.267236 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2"] Mar 08 19:47:55 crc kubenswrapper[4885]: W0308 19:47:55.277146 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda393ebaa_f427_44ff_965d_c1c65ab661ba.slice/crio-7cd56b1530ea2b03662fb4fce8d73ff574ac7f9a43e721982ac177d576a5291a WatchSource:0}: Error finding container 7cd56b1530ea2b03662fb4fce8d73ff574ac7f9a43e721982ac177d576a5291a: Status 404 returned error can't find the container with id 7cd56b1530ea2b03662fb4fce8d73ff574ac7f9a43e721982ac177d576a5291a Mar 08 19:47:55 crc kubenswrapper[4885]: I0308 19:47:55.755479 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" event={"ID":"a393ebaa-f427-44ff-965d-c1c65ab661ba","Type":"ContainerStarted","Data":"7cd56b1530ea2b03662fb4fce8d73ff574ac7f9a43e721982ac177d576a5291a"} Mar 08 19:47:58 crc kubenswrapper[4885]: I0308 19:47:58.787595 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" event={"ID":"a393ebaa-f427-44ff-965d-c1c65ab661ba","Type":"ContainerStarted","Data":"d91f8f7282045a6c1b3a077232b0d4c4e2e38f556f39912f0ba6add662d2a0ac"} Mar 08 19:47:58 crc kubenswrapper[4885]: I0308 19:47:58.821275 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" podStartSLOduration=1.593387292 podStartE2EDuration="4.821248827s" podCreationTimestamp="2026-03-08 19:47:54 +0000 UTC" firstStartedPulling="2026-03-08 19:47:55.280328839 +0000 UTC m=+976.676382882" 
lastFinishedPulling="2026-03-08 19:47:58.508190354 +0000 UTC m=+979.904244417" observedRunningTime="2026-03-08 19:47:58.812729101 +0000 UTC m=+980.208783124" watchObservedRunningTime="2026-03-08 19:47:58.821248827 +0000 UTC m=+980.217302880" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.157506 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549988-f7hdz"] Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.158191 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.160870 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.161603 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.166182 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549988-f7hdz"] Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.170171 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.348737 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxw5h\" (UniqueName: \"kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h\") pod \"auto-csr-approver-29549988-f7hdz\" (UID: \"a1daba97-3389-4e45-8a6c-bf910619f315\") " pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.450021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxw5h\" (UniqueName: 
\"kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h\") pod \"auto-csr-approver-29549988-f7hdz\" (UID: \"a1daba97-3389-4e45-8a6c-bf910619f315\") " pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.474972 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxw5h\" (UniqueName: \"kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h\") pod \"auto-csr-approver-29549988-f7hdz\" (UID: \"a1daba97-3389-4e45-8a6c-bf910619f315\") " pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.771658 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:01 crc kubenswrapper[4885]: I0308 19:48:01.084176 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549988-f7hdz"] Mar 08 19:48:01 crc kubenswrapper[4885]: I0308 19:48:01.806714 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" event={"ID":"a1daba97-3389-4e45-8a6c-bf910619f315","Type":"ContainerStarted","Data":"a2a0006e3cbad50961fd02041119490cc7d99f33f80e8b9e1e8e7b7e685822b2"} Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.406538 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kgm5k"] Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.407577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.409796 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.410072 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.412878 4885 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-s2vj9" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.426553 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kgm5k"] Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.577963 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.578061 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-556fr\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-kube-api-access-556fr\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.644497 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fbnvg"] Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.645153 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.647911 4885 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8nw7z"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.662189 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fbnvg"]
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.680174 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.680251 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-556fr\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-kube-api-access-556fr\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.700969 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-556fr\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-kube-api-access-556fr\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.701641 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.730660 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.781316 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qm7s\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-kube-api-access-7qm7s\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.781583 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.821468 4885 generic.go:334] "Generic (PLEG): container finished" podID="a1daba97-3389-4e45-8a6c-bf910619f315" containerID="027375be475663b68fa34275cf933a5f73118e3902051a04110bd2c7ec89a43e" exitCode=0
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.821526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" event={"ID":"a1daba97-3389-4e45-8a6c-bf910619f315","Type":"ContainerDied","Data":"027375be475663b68fa34275cf933a5f73118e3902051a04110bd2c7ec89a43e"}
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.883230 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.883333 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qm7s\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-kube-api-access-7qm7s\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.901222 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.902175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qm7s\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-kube-api-access-7qm7s\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg"
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.947396 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kgm5k"]
Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.957269 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg"
Mar 08 19:48:03 crc kubenswrapper[4885]: I0308 19:48:03.353590 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fbnvg"]
Mar 08 19:48:03 crc kubenswrapper[4885]: W0308 19:48:03.360670 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd62feb91_9474_41c0_b79c_93f3f6dd830b.slice/crio-40cf97615bee3872071ecda7b71b8a59d28db984ed1adbfee464e1e3073f50fa WatchSource:0}: Error finding container 40cf97615bee3872071ecda7b71b8a59d28db984ed1adbfee464e1e3073f50fa: Status 404 returned error can't find the container with id 40cf97615bee3872071ecda7b71b8a59d28db984ed1adbfee464e1e3073f50fa
Mar 08 19:48:03 crc kubenswrapper[4885]: I0308 19:48:03.836195 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" event={"ID":"de1b5c94-7518-46c5-af4a-2b692d23b3b7","Type":"ContainerStarted","Data":"1ee95e3c9eeb42138759c5e8f5f4e7b9f1c317fbf5599a74999a390cfa925578"}
Mar 08 19:48:03 crc kubenswrapper[4885]: I0308 19:48:03.839493 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" event={"ID":"d62feb91-9474-41c0-b79c-93f3f6dd830b","Type":"ContainerStarted","Data":"40cf97615bee3872071ecda7b71b8a59d28db984ed1adbfee464e1e3073f50fa"}
Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.143387 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549988-f7hdz"
Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.201110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxw5h\" (UniqueName: \"kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h\") pod \"a1daba97-3389-4e45-8a6c-bf910619f315\" (UID: \"a1daba97-3389-4e45-8a6c-bf910619f315\") "
Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.206086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h" (OuterVolumeSpecName: "kube-api-access-rxw5h") pod "a1daba97-3389-4e45-8a6c-bf910619f315" (UID: "a1daba97-3389-4e45-8a6c-bf910619f315"). InnerVolumeSpecName "kube-api-access-rxw5h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.302883 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxw5h\" (UniqueName: \"kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h\") on node \"crc\" DevicePath \"\""
Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.848611 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" event={"ID":"a1daba97-3389-4e45-8a6c-bf910619f315","Type":"ContainerDied","Data":"a2a0006e3cbad50961fd02041119490cc7d99f33f80e8b9e1e8e7b7e685822b2"}
Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.848644 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549988-f7hdz"
Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.848650 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2a0006e3cbad50961fd02041119490cc7d99f33f80e8b9e1e8e7b7e685822b2"
Mar 08 19:48:05 crc kubenswrapper[4885]: I0308 19:48:05.193352 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549982-jnpc6"]
Mar 08 19:48:05 crc kubenswrapper[4885]: I0308 19:48:05.197789 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549982-jnpc6"]
Mar 08 19:48:05 crc kubenswrapper[4885]: I0308 19:48:05.374246 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a90e18-1089-40ae-a5f0-f43b1d252129" path="/var/lib/kubelet/pods/30a90e18-1089-40ae-a5f0-f43b1d252129/volumes"
Mar 08 19:48:07 crc kubenswrapper[4885]: I0308 19:48:07.895466 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" event={"ID":"d62feb91-9474-41c0-b79c-93f3f6dd830b","Type":"ContainerStarted","Data":"942328ef56ffd155036408f96b9736bc5c67207a26dc77713fffc4c57a453745"}
Mar 08 19:48:07 crc kubenswrapper[4885]: I0308 19:48:07.897683 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" event={"ID":"de1b5c94-7518-46c5-af4a-2b692d23b3b7","Type":"ContainerStarted","Data":"6f96863a032c2582c3555288d36bfc786629f773c77ae05858ca77a1bbb00362"}
Mar 08 19:48:07 crc kubenswrapper[4885]: I0308 19:48:07.897838 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k"
Mar 08 19:48:07 crc kubenswrapper[4885]: I0308 19:48:07.926180 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" podStartSLOduration=2.05786414 podStartE2EDuration="5.926157382s" podCreationTimestamp="2026-03-08 19:48:02 +0000 UTC" firstStartedPulling="2026-03-08 19:48:03.36206761 +0000 UTC m=+984.758121633" lastFinishedPulling="2026-03-08 19:48:07.230360862 +0000 UTC m=+988.626414875" observedRunningTime="2026-03-08 19:48:07.921000384 +0000 UTC m=+989.317054417" watchObservedRunningTime="2026-03-08 19:48:07.926157382 +0000 UTC m=+989.322211415"
Mar 08 19:48:07 crc kubenswrapper[4885]: I0308 19:48:07.945329 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" podStartSLOduration=1.6945314580000002 podStartE2EDuration="5.945314341s" podCreationTimestamp="2026-03-08 19:48:02 +0000 UTC" firstStartedPulling="2026-03-08 19:48:02.953668219 +0000 UTC m=+984.349722252" lastFinishedPulling="2026-03-08 19:48:07.204451112 +0000 UTC m=+988.600505135" observedRunningTime="2026-03-08 19:48:07.942282471 +0000 UTC m=+989.338336504" watchObservedRunningTime="2026-03-08 19:48:07.945314341 +0000 UTC m=+989.341368374"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.525524 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2v5mz"]
Mar 08 19:48:10 crc kubenswrapper[4885]: E0308 19:48:10.526185 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1daba97-3389-4e45-8a6c-bf910619f315" containerName="oc"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.526206 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1daba97-3389-4e45-8a6c-bf910619f315" containerName="oc"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.526396 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1daba97-3389-4e45-8a6c-bf910619f315" containerName="oc"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.527804 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.561503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2v5mz"]
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.689131 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhwm\" (UniqueName: \"kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.689208 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.689368 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.790688 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.791070 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhwm\" (UniqueName: \"kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.791232 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.791682 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.791936 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.817456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhwm\" (UniqueName: \"kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.906624 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:11 crc kubenswrapper[4885]: I0308 19:48:11.347781 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2v5mz"]
Mar 08 19:48:11 crc kubenswrapper[4885]: I0308 19:48:11.936466 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerStarted","Data":"0a73aa511408aa95a3e655b73c27028296781e4764c3f2dfa59ddabd59e8bddb"}
Mar 08 19:48:12 crc kubenswrapper[4885]: I0308 19:48:12.735698 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k"
Mar 08 19:48:12 crc kubenswrapper[4885]: I0308 19:48:12.947079 4885 generic.go:334] "Generic (PLEG): container finished" podID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerID="465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2" exitCode=0
Mar 08 19:48:12 crc kubenswrapper[4885]: I0308 19:48:12.947141 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerDied","Data":"465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2"}
Mar 08 19:48:13 crc kubenswrapper[4885]: I0308 19:48:13.957115 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerStarted","Data":"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3"}
Mar 08 19:48:14 crc kubenswrapper[4885]: I0308 19:48:14.964392 4885 generic.go:334] "Generic (PLEG): container finished" podID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerID="3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3" exitCode=0
Mar 08 19:48:14 crc kubenswrapper[4885]: I0308 19:48:14.964439 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerDied","Data":"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3"}
Mar 08 19:48:15 crc kubenswrapper[4885]: I0308 19:48:15.979132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerStarted","Data":"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99"}
Mar 08 19:48:16 crc kubenswrapper[4885]: I0308 19:48:16.005178 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2v5mz" podStartSLOduration=3.370326781 podStartE2EDuration="6.005158062s" podCreationTimestamp="2026-03-08 19:48:10 +0000 UTC" firstStartedPulling="2026-03-08 19:48:12.950238879 +0000 UTC m=+994.346292932" lastFinishedPulling="2026-03-08 19:48:15.58507018 +0000 UTC m=+996.981124213" observedRunningTime="2026-03-08 19:48:16.001981346 +0000 UTC m=+997.398035399" watchObservedRunningTime="2026-03-08 19:48:16.005158062 +0000 UTC m=+997.401212095"
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.009735 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-8wbq2"]
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.011749 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-8wbq2"
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.014826 4885 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-r5mnt"
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.027690 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-8wbq2"]
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.075120 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zghcz\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-kube-api-access-zghcz\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2"
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.075457 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-bound-sa-token\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2"
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.177592 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zghcz\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-kube-api-access-zghcz\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2"
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.177991 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-bound-sa-token\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2"
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.200506 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-bound-sa-token\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2"
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.206785 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zghcz\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-kube-api-access-zghcz\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2"
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.341302 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-8wbq2"
Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.860679 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-8wbq2"]
Mar 08 19:48:20 crc kubenswrapper[4885]: I0308 19:48:20.024489 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-8wbq2" event={"ID":"6da97aa0-4c69-414f-8fda-23403d2346e5","Type":"ContainerStarted","Data":"6fbea6e70aed64f786da673b7e230c3abec1c21d00085441fc32925adc52e19d"}
Mar 08 19:48:20 crc kubenswrapper[4885]: I0308 19:48:20.907424 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:20 crc kubenswrapper[4885]: I0308 19:48:20.907729 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:20 crc kubenswrapper[4885]: I0308 19:48:20.978814 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:21 crc kubenswrapper[4885]: I0308 19:48:21.035413 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-8wbq2" event={"ID":"6da97aa0-4c69-414f-8fda-23403d2346e5","Type":"ContainerStarted","Data":"38e60205adb96f6813b47ff0dfa9d44933c7a553a8ea2f867b06bee478816eda"}
Mar 08 19:48:21 crc kubenswrapper[4885]: I0308 19:48:21.055115 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-8wbq2" podStartSLOduration=3.055095735 podStartE2EDuration="3.055095735s" podCreationTimestamp="2026-03-08 19:48:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:48:21.054362756 +0000 UTC m=+1002.450416819" watchObservedRunningTime="2026-03-08 19:48:21.055095735 +0000 UTC m=+1002.451149768"
Mar 08 19:48:21 crc kubenswrapper[4885]: I0308 19:48:21.100250 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:21 crc kubenswrapper[4885]: I0308 19:48:21.225977 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2v5mz"]
Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.049441 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2v5mz" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="registry-server" containerID="cri-o://919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99" gracePeriod=2
Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.478445 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.560688 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlhwm\" (UniqueName: \"kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm\") pod \"a3dca148-e7cb-4675-918c-e773447bdcf4\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") "
Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.560837 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities\") pod \"a3dca148-e7cb-4675-918c-e773447bdcf4\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") "
Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.561061 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content\") pod \"a3dca148-e7cb-4675-918c-e773447bdcf4\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") "
Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.562009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities" (OuterVolumeSpecName: "utilities") pod "a3dca148-e7cb-4675-918c-e773447bdcf4" (UID: "a3dca148-e7cb-4675-918c-e773447bdcf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.570227 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm" (OuterVolumeSpecName: "kube-api-access-dlhwm") pod "a3dca148-e7cb-4675-918c-e773447bdcf4" (UID: "a3dca148-e7cb-4675-918c-e773447bdcf4"). InnerVolumeSpecName "kube-api-access-dlhwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.635846 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3dca148-e7cb-4675-918c-e773447bdcf4" (UID: "a3dca148-e7cb-4675-918c-e773447bdcf4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.663049 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.663121 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlhwm\" (UniqueName: \"kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm\") on node \"crc\" DevicePath \"\""
Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.663154 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.062252 4885 generic.go:334] "Generic (PLEG): container finished" podID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerID="919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99" exitCode=0
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.062315 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerDied","Data":"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99"}
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.062352 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerDied","Data":"0a73aa511408aa95a3e655b73c27028296781e4764c3f2dfa59ddabd59e8bddb"}
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.062389 4885 scope.go:117] "RemoveContainer" containerID="919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99"
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.062548 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2v5mz"
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.091991 4885 scope.go:117] "RemoveContainer" containerID="3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3"
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.116049 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2v5mz"]
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.122321 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2v5mz"]
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.136593 4885 scope.go:117] "RemoveContainer" containerID="465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2"
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.161246 4885 scope.go:117] "RemoveContainer" containerID="919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99"
Mar 08 19:48:24 crc kubenswrapper[4885]: E0308 19:48:24.161760 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99\": container with ID starting with 919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99 not found: ID does not exist" containerID="919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99"
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.161809 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99"} err="failed to get container status \"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99\": rpc error: code = NotFound desc = could not find container \"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99\": container with ID starting with 919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99 not found: ID does not exist"
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.161844 4885 scope.go:117] "RemoveContainer" containerID="3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3"
Mar 08 19:48:24 crc kubenswrapper[4885]: E0308 19:48:24.162434 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3\": container with ID starting with 3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3 not found: ID does not exist" containerID="3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3"
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.162507 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3"} err="failed to get container status \"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3\": rpc error: code = NotFound desc = could not find container \"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3\": container with ID starting with 3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3 not found: ID does not exist"
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.162554 4885 scope.go:117] "RemoveContainer" containerID="465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2"
Mar 08 19:48:24 crc kubenswrapper[4885]: E0308 19:48:24.163025 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2\": container with ID starting with 465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2 not found: ID does not exist" containerID="465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2"
Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.163071 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2"} err="failed to get container status \"465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2\": rpc error: code = NotFound desc = could not find container \"465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2\": container with ID starting with 465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2 not found: ID does not exist"
Mar 08 19:48:25 crc kubenswrapper[4885]: I0308 19:48:25.380983 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" path="/var/lib/kubelet/pods/a3dca148-e7cb-4675-918c-e773447bdcf4/volumes"
Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.749446 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5267w"]
Mar 08 19:48:26 crc kubenswrapper[4885]: E0308 19:48:26.749778 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="registry-server"
Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.749800 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="registry-server"
Mar 08 19:48:26 crc kubenswrapper[4885]: E0308 19:48:26.749821 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="extract-utilities"
Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.749833 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="extract-utilities"
Mar 08 19:48:26 crc kubenswrapper[4885]: E0308 19:48:26.749860 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="extract-content"
Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.749871 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="extract-content"
Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.750084 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="registry-server"
Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.750656 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5267w"
Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.753129 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-trhqw"
Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.753190 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.753240 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.769869 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5267w"]
Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.808256 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4htp\" (UniqueName:
\"kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp\") pod \"openstack-operator-index-5267w\" (UID: \"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f\") " pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.909328 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4htp\" (UniqueName: \"kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp\") pod \"openstack-operator-index-5267w\" (UID: \"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f\") " pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.933370 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4htp\" (UniqueName: \"kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp\") pod \"openstack-operator-index-5267w\" (UID: \"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f\") " pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:27 crc kubenswrapper[4885]: I0308 19:48:27.069025 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:27 crc kubenswrapper[4885]: I0308 19:48:27.291213 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5267w"] Mar 08 19:48:28 crc kubenswrapper[4885]: I0308 19:48:28.100941 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5267w" event={"ID":"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f","Type":"ContainerStarted","Data":"3430220374f71b013dec40d23d384e699c7dd793743ffe3164939afc082b9453"} Mar 08 19:48:29 crc kubenswrapper[4885]: I0308 19:48:29.111766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5267w" event={"ID":"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f","Type":"ContainerStarted","Data":"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802"} Mar 08 19:48:29 crc kubenswrapper[4885]: I0308 19:48:29.141722 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5267w" podStartSLOduration=2.231652542 podStartE2EDuration="3.141692345s" podCreationTimestamp="2026-03-08 19:48:26 +0000 UTC" firstStartedPulling="2026-03-08 19:48:27.301617268 +0000 UTC m=+1008.697671301" lastFinishedPulling="2026-03-08 19:48:28.211657041 +0000 UTC m=+1009.607711104" observedRunningTime="2026-03-08 19:48:29.132396877 +0000 UTC m=+1010.528450930" watchObservedRunningTime="2026-03-08 19:48:29.141692345 +0000 UTC m=+1010.537746398" Mar 08 19:48:30 crc kubenswrapper[4885]: I0308 19:48:30.829780 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5267w"] Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.127650 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5267w" podUID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" containerName="registry-server" 
containerID="cri-o://1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802" gracePeriod=2 Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.444028 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-w4b99"] Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.445190 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.483587 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w4b99"] Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.592552 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gcc5\" (UniqueName: \"kubernetes.io/projected/024a1da8-dfa6-4cdc-a5ec-12b9ce56969a-kube-api-access-8gcc5\") pod \"openstack-operator-index-w4b99\" (UID: \"024a1da8-dfa6-4cdc-a5ec-12b9ce56969a\") " pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.593811 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.694087 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4htp\" (UniqueName: \"kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp\") pod \"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f\" (UID: \"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f\") " Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.694371 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gcc5\" (UniqueName: \"kubernetes.io/projected/024a1da8-dfa6-4cdc-a5ec-12b9ce56969a-kube-api-access-8gcc5\") pod \"openstack-operator-index-w4b99\" (UID: \"024a1da8-dfa6-4cdc-a5ec-12b9ce56969a\") " pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.703366 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp" (OuterVolumeSpecName: "kube-api-access-l4htp") pod "fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" (UID: "fa9e3b2e-4be2-420d-a812-37c7eddfcb1f"). InnerVolumeSpecName "kube-api-access-l4htp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.732672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gcc5\" (UniqueName: \"kubernetes.io/projected/024a1da8-dfa6-4cdc-a5ec-12b9ce56969a-kube-api-access-8gcc5\") pod \"openstack-operator-index-w4b99\" (UID: \"024a1da8-dfa6-4cdc-a5ec-12b9ce56969a\") " pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.795016 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.795424 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4htp\" (UniqueName: \"kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.139238 4885 generic.go:334] "Generic (PLEG): container finished" podID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" containerID="1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802" exitCode=0 Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.139348 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.139382 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5267w" event={"ID":"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f","Type":"ContainerDied","Data":"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802"} Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.139669 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5267w" event={"ID":"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f","Type":"ContainerDied","Data":"3430220374f71b013dec40d23d384e699c7dd793743ffe3164939afc082b9453"} Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.139705 4885 scope.go:117] "RemoveContainer" containerID="1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802" Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.178309 4885 scope.go:117] "RemoveContainer" containerID="1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802" Mar 08 19:48:32 crc kubenswrapper[4885]: E0308 19:48:32.178809 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802\": container with ID starting with 1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802 not found: ID does not exist" containerID="1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802" Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.178863 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802"} err="failed to get container status \"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802\": rpc error: code = NotFound desc = could not find container \"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802\": container with ID starting with 1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802 not found: ID does not exist" Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.179855 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5267w"] Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.184977 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5267w"] Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.232036 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w4b99"] Mar 08 19:48:32 crc kubenswrapper[4885]: W0308 19:48:32.234518 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod024a1da8_dfa6_4cdc_a5ec_12b9ce56969a.slice/crio-ce9807f0dd2e63a708f2672e011177f85e964d2ce50f73a3ab3ca4b8400a2f08 WatchSource:0}: Error finding container ce9807f0dd2e63a708f2672e011177f85e964d2ce50f73a3ab3ca4b8400a2f08: Status 404 returned error can't find the container with id ce9807f0dd2e63a708f2672e011177f85e964d2ce50f73a3ab3ca4b8400a2f08 Mar 08 19:48:33 crc kubenswrapper[4885]: 
I0308 19:48:33.151116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w4b99" event={"ID":"024a1da8-dfa6-4cdc-a5ec-12b9ce56969a","Type":"ContainerStarted","Data":"0d115ef311167762f2e12c7320218c342afcea251defcdda70b43918aff8c331"} Mar 08 19:48:33 crc kubenswrapper[4885]: I0308 19:48:33.151428 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w4b99" event={"ID":"024a1da8-dfa6-4cdc-a5ec-12b9ce56969a","Type":"ContainerStarted","Data":"ce9807f0dd2e63a708f2672e011177f85e964d2ce50f73a3ab3ca4b8400a2f08"} Mar 08 19:48:33 crc kubenswrapper[4885]: I0308 19:48:33.178899 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-w4b99" podStartSLOduration=1.7407426209999999 podStartE2EDuration="2.178871302s" podCreationTimestamp="2026-03-08 19:48:31 +0000 UTC" firstStartedPulling="2026-03-08 19:48:32.240033894 +0000 UTC m=+1013.636087927" lastFinishedPulling="2026-03-08 19:48:32.678162545 +0000 UTC m=+1014.074216608" observedRunningTime="2026-03-08 19:48:33.170058818 +0000 UTC m=+1014.566112881" watchObservedRunningTime="2026-03-08 19:48:33.178871302 +0000 UTC m=+1014.574925365" Mar 08 19:48:33 crc kubenswrapper[4885]: I0308 19:48:33.381541 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" path="/var/lib/kubelet/pods/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f/volumes" Mar 08 19:48:41 crc kubenswrapper[4885]: I0308 19:48:41.796205 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:41 crc kubenswrapper[4885]: I0308 19:48:41.796886 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:41 crc kubenswrapper[4885]: I0308 19:48:41.833355 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:42 crc kubenswrapper[4885]: I0308 19:48:42.276062 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:45 crc kubenswrapper[4885]: I0308 19:48:45.924261 4885 scope.go:117] "RemoveContainer" containerID="c050624aad83fb4c435f0fa087d6fe3ddc6c1b029b5c8f9e354ed5228ef2d3fa" Mar 08 19:48:48 crc kubenswrapper[4885]: I0308 19:48:48.988573 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb"] Mar 08 19:48:48 crc kubenswrapper[4885]: E0308 19:48:48.989557 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" containerName="registry-server" Mar 08 19:48:48 crc kubenswrapper[4885]: I0308 19:48:48.989581 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" containerName="registry-server" Mar 08 19:48:48 crc kubenswrapper[4885]: I0308 19:48:48.989754 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" containerName="registry-server" Mar 08 19:48:48 crc kubenswrapper[4885]: I0308 19:48:48.991018 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:48 crc kubenswrapper[4885]: I0308 19:48:48.993340 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wvm7p" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.016612 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb"] Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.095634 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.095913 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtqnh\" (UniqueName: \"kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.096061 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 
19:48:49.197690 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.197872 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtqnh\" (UniqueName: \"kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.197951 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.198473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.198590 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.236434 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtqnh\" (UniqueName: \"kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.307462 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.760942 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb"] Mar 08 19:48:50 crc kubenswrapper[4885]: I0308 19:48:50.296278 4885 generic.go:334] "Generic (PLEG): container finished" podID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerID="e6ee16a25650b52efd83ec25e72bba06d68b2f1bd78837a8a0af9244697e46f5" exitCode=0 Mar 08 19:48:50 crc kubenswrapper[4885]: I0308 19:48:50.296345 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" event={"ID":"5fb2cd81-437a-46be-93b5-b96ec94b1d1c","Type":"ContainerDied","Data":"e6ee16a25650b52efd83ec25e72bba06d68b2f1bd78837a8a0af9244697e46f5"} Mar 08 19:48:50 crc kubenswrapper[4885]: I0308 19:48:50.296690 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" event={"ID":"5fb2cd81-437a-46be-93b5-b96ec94b1d1c","Type":"ContainerStarted","Data":"76747c71d15f5f8e83176ea2ced98abd16cd10232d0e50cf6db9a200b49a6fc1"} Mar 08 19:48:52 crc kubenswrapper[4885]: I0308 19:48:52.323822 4885 generic.go:334] "Generic (PLEG): container finished" podID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerID="722a49d08a61a9f585e665550df0274c058721ad95c30abe5744d3106e0a637f" exitCode=0 Mar 08 19:48:52 crc kubenswrapper[4885]: I0308 19:48:52.323908 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" event={"ID":"5fb2cd81-437a-46be-93b5-b96ec94b1d1c","Type":"ContainerDied","Data":"722a49d08a61a9f585e665550df0274c058721ad95c30abe5744d3106e0a637f"} Mar 08 19:48:53 crc kubenswrapper[4885]: I0308 19:48:53.336811 4885 generic.go:334] "Generic (PLEG): container finished" podID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerID="6f1f51906cfdbcd3ba2235908cd64569180d6b7ce50833fc3b3b9f7d32adb160" exitCode=0 Mar 08 19:48:53 crc kubenswrapper[4885]: I0308 19:48:53.336999 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" event={"ID":"5fb2cd81-437a-46be-93b5-b96ec94b1d1c","Type":"ContainerDied","Data":"6f1f51906cfdbcd3ba2235908cd64569180d6b7ce50833fc3b3b9f7d32adb160"} Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.683989 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.798435 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util\") pod \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.798550 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtqnh\" (UniqueName: \"kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh\") pod \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.798709 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle\") pod \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.799722 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle" (OuterVolumeSpecName: "bundle") pod "5fb2cd81-437a-46be-93b5-b96ec94b1d1c" (UID: "5fb2cd81-437a-46be-93b5-b96ec94b1d1c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.806349 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh" (OuterVolumeSpecName: "kube-api-access-jtqnh") pod "5fb2cd81-437a-46be-93b5-b96ec94b1d1c" (UID: "5fb2cd81-437a-46be-93b5-b96ec94b1d1c"). InnerVolumeSpecName "kube-api-access-jtqnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.831062 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util" (OuterVolumeSpecName: "util") pod "5fb2cd81-437a-46be-93b5-b96ec94b1d1c" (UID: "5fb2cd81-437a-46be-93b5-b96ec94b1d1c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.900828 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.900881 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.900900 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtqnh\" (UniqueName: \"kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:55 crc kubenswrapper[4885]: I0308 19:48:55.357843 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" event={"ID":"5fb2cd81-437a-46be-93b5-b96ec94b1d1c","Type":"ContainerDied","Data":"76747c71d15f5f8e83176ea2ced98abd16cd10232d0e50cf6db9a200b49a6fc1"} Mar 08 19:48:55 crc kubenswrapper[4885]: I0308 19:48:55.357885 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76747c71d15f5f8e83176ea2ced98abd16cd10232d0e50cf6db9a200b49a6fc1" Mar 08 19:48:55 crc kubenswrapper[4885]: I0308 19:48:55.357953 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.947399 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:48:58 crc kubenswrapper[4885]: E0308 19:48:58.948084 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="util" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.948106 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="util" Mar 08 19:48:58 crc kubenswrapper[4885]: E0308 19:48:58.948126 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="pull" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.948138 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="pull" Mar 08 19:48:58 crc kubenswrapper[4885]: E0308 19:48:58.948160 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="extract" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.948173 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="extract" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.948379 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="extract" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.949852 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.963872 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.964195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vggb\" (UniqueName: \"kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.964258 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.972474 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.065870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vggb\" (UniqueName: \"kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.066203 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.066401 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.066658 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.067006 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.088582 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vggb\" (UniqueName: \"kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.284632 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.559605 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:49:00 crc kubenswrapper[4885]: I0308 19:49:00.395297 4885 generic.go:334] "Generic (PLEG): container finished" podID="949c8a20-4064-489d-b823-8eb76415df83" containerID="c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037" exitCode=0 Mar 08 19:49:00 crc kubenswrapper[4885]: I0308 19:49:00.395526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerDied","Data":"c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037"} Mar 08 19:49:00 crc kubenswrapper[4885]: I0308 19:49:00.395548 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerStarted","Data":"dc27c8a1889ec091ab9edee5fb97f87ce1898f17ea1459b60bfccc24e32126ec"} Mar 08 19:49:01 crc kubenswrapper[4885]: I0308 19:49:01.414091 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerStarted","Data":"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832"} Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.003698 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4"] Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.004471 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.006070 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-s4sdf" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.028890 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4"] Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.125856 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n74vq\" (UniqueName: \"kubernetes.io/projected/9acb4d66-3a49-42b7-bd78-4d904f080c50-kube-api-access-n74vq\") pod \"openstack-operator-controller-init-6f44f7b99f-l5vj4\" (UID: \"9acb4d66-3a49-42b7-bd78-4d904f080c50\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.227547 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n74vq\" (UniqueName: \"kubernetes.io/projected/9acb4d66-3a49-42b7-bd78-4d904f080c50-kube-api-access-n74vq\") pod \"openstack-operator-controller-init-6f44f7b99f-l5vj4\" (UID: \"9acb4d66-3a49-42b7-bd78-4d904f080c50\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.247771 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n74vq\" (UniqueName: \"kubernetes.io/projected/9acb4d66-3a49-42b7-bd78-4d904f080c50-kube-api-access-n74vq\") pod \"openstack-operator-controller-init-6f44f7b99f-l5vj4\" (UID: \"9acb4d66-3a49-42b7-bd78-4d904f080c50\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.320549 4885 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.426475 4885 generic.go:334] "Generic (PLEG): container finished" podID="949c8a20-4064-489d-b823-8eb76415df83" containerID="7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832" exitCode=0 Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.426520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerDied","Data":"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832"} Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.621146 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4"] Mar 08 19:49:02 crc kubenswrapper[4885]: W0308 19:49:02.627109 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9acb4d66_3a49_42b7_bd78_4d904f080c50.slice/crio-f9b21a88d5091abe7f044beb42c1a1ae8b4392eace7f8be9ed365ed1da32a218 WatchSource:0}: Error finding container f9b21a88d5091abe7f044beb42c1a1ae8b4392eace7f8be9ed365ed1da32a218: Status 404 returned error can't find the container with id f9b21a88d5091abe7f044beb42c1a1ae8b4392eace7f8be9ed365ed1da32a218 Mar 08 19:49:03 crc kubenswrapper[4885]: I0308 19:49:03.442814 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerStarted","Data":"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2"} Mar 08 19:49:03 crc kubenswrapper[4885]: I0308 19:49:03.444220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" 
event={"ID":"9acb4d66-3a49-42b7-bd78-4d904f080c50","Type":"ContainerStarted","Data":"f9b21a88d5091abe7f044beb42c1a1ae8b4392eace7f8be9ed365ed1da32a218"} Mar 08 19:49:03 crc kubenswrapper[4885]: I0308 19:49:03.465751 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b8gsx" podStartSLOduration=3.028008525 podStartE2EDuration="5.465730899s" podCreationTimestamp="2026-03-08 19:48:58 +0000 UTC" firstStartedPulling="2026-03-08 19:49:00.396842375 +0000 UTC m=+1041.792896398" lastFinishedPulling="2026-03-08 19:49:02.834564749 +0000 UTC m=+1044.230618772" observedRunningTime="2026-03-08 19:49:03.461145506 +0000 UTC m=+1044.857199549" watchObservedRunningTime="2026-03-08 19:49:03.465730899 +0000 UTC m=+1044.861784932" Mar 08 19:49:07 crc kubenswrapper[4885]: I0308 19:49:07.475311 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" event={"ID":"9acb4d66-3a49-42b7-bd78-4d904f080c50","Type":"ContainerStarted","Data":"9d4a02bf406ef164aebb4bd1746fa43fc8883eecb3eff198f8b96f01188e9923"} Mar 08 19:49:07 crc kubenswrapper[4885]: I0308 19:49:07.476126 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:07 crc kubenswrapper[4885]: I0308 19:49:07.542819 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" podStartSLOduration=2.048639298 podStartE2EDuration="6.542798848s" podCreationTimestamp="2026-03-08 19:49:01 +0000 UTC" firstStartedPulling="2026-03-08 19:49:02.628942216 +0000 UTC m=+1044.024996239" lastFinishedPulling="2026-03-08 19:49:07.123101746 +0000 UTC m=+1048.519155789" observedRunningTime="2026-03-08 19:49:07.53991634 +0000 UTC m=+1048.935970363" watchObservedRunningTime="2026-03-08 19:49:07.542798848 +0000 UTC 
m=+1048.938852881" Mar 08 19:49:09 crc kubenswrapper[4885]: I0308 19:49:09.285639 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:09 crc kubenswrapper[4885]: I0308 19:49:09.285713 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:09 crc kubenswrapper[4885]: I0308 19:49:09.381346 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:09 crc kubenswrapper[4885]: I0308 19:49:09.557155 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:10 crc kubenswrapper[4885]: I0308 19:49:10.314203 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.512700 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b8gsx" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="registry-server" containerID="cri-o://531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2" gracePeriod=2 Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.926858 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.965778 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities\") pod \"949c8a20-4064-489d-b823-8eb76415df83\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.965857 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vggb\" (UniqueName: \"kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb\") pod \"949c8a20-4064-489d-b823-8eb76415df83\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.965906 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content\") pod \"949c8a20-4064-489d-b823-8eb76415df83\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.967174 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities" (OuterVolumeSpecName: "utilities") pod "949c8a20-4064-489d-b823-8eb76415df83" (UID: "949c8a20-4064-489d-b823-8eb76415df83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.974864 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb" (OuterVolumeSpecName: "kube-api-access-7vggb") pod "949c8a20-4064-489d-b823-8eb76415df83" (UID: "949c8a20-4064-489d-b823-8eb76415df83"). InnerVolumeSpecName "kube-api-access-7vggb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.067716 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.067757 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vggb\" (UniqueName: \"kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb\") on node \"crc\" DevicePath \"\"" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.224996 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "949c8a20-4064-489d-b823-8eb76415df83" (UID: "949c8a20-4064-489d-b823-8eb76415df83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.269777 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.324130 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.522641 4885 generic.go:334] "Generic (PLEG): container finished" podID="949c8a20-4064-489d-b823-8eb76415df83" containerID="531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2" exitCode=0 Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.522706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" 
event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerDied","Data":"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2"} Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.522748 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerDied","Data":"dc27c8a1889ec091ab9edee5fb97f87ce1898f17ea1459b60bfccc24e32126ec"} Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.522778 4885 scope.go:117] "RemoveContainer" containerID="531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.522994 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.541795 4885 scope.go:117] "RemoveContainer" containerID="7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.560327 4885 scope.go:117] "RemoveContainer" containerID="c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.585625 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.586940 4885 scope.go:117] "RemoveContainer" containerID="531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2" Mar 08 19:49:12 crc kubenswrapper[4885]: E0308 19:49:12.589593 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2\": container with ID starting with 531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2 not found: ID does not exist" 
containerID="531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.589643 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2"} err="failed to get container status \"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2\": rpc error: code = NotFound desc = could not find container \"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2\": container with ID starting with 531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2 not found: ID does not exist" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.589674 4885 scope.go:117] "RemoveContainer" containerID="7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832" Mar 08 19:49:12 crc kubenswrapper[4885]: E0308 19:49:12.589974 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832\": container with ID starting with 7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832 not found: ID does not exist" containerID="7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.590003 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832"} err="failed to get container status \"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832\": rpc error: code = NotFound desc = could not find container \"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832\": container with ID starting with 7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832 not found: ID does not exist" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.590021 4885 scope.go:117] 
"RemoveContainer" containerID="c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037" Mar 08 19:49:12 crc kubenswrapper[4885]: E0308 19:49:12.590312 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037\": container with ID starting with c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037 not found: ID does not exist" containerID="c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.590369 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037"} err="failed to get container status \"c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037\": rpc error: code = NotFound desc = could not find container \"c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037\": container with ID starting with c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037 not found: ID does not exist" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.590386 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:49:13 crc kubenswrapper[4885]: I0308 19:49:13.382372 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949c8a20-4064-489d-b823-8eb76415df83" path="/var/lib/kubelet/pods/949c8a20-4064-489d-b823-8eb76415df83/volumes" Mar 08 19:49:32 crc kubenswrapper[4885]: I0308 19:49:32.838667 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:49:32 crc kubenswrapper[4885]: I0308 19:49:32.839667 4885 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.719131 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5"] Mar 08 19:49:49 crc kubenswrapper[4885]: E0308 19:49:49.719754 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="extract-content" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.719766 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="extract-content" Mar 08 19:49:49 crc kubenswrapper[4885]: E0308 19:49:49.719782 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="registry-server" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.719788 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="registry-server" Mar 08 19:49:49 crc kubenswrapper[4885]: E0308 19:49:49.719803 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="extract-utilities" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.719810 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="extract-utilities" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.719915 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="registry-server" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.720297 4885 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.722415 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-fwd6q" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.727322 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.728222 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.733792 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-cbj2v" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.743908 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.745202 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.747350 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4bgw5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.748155 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.752006 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.757347 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.777417 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.780115 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.784502 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-p6wb8" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.793651 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.839380 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn8vs\" (UniqueName: \"kubernetes.io/projected/45c29030-0945-4655-b035-d75e8bf0f818-kube-api-access-jn8vs\") pod \"designate-operator-controller-manager-5d87c9d997-sbrjr\" (UID: \"45c29030-0945-4655-b035-d75e8bf0f818\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.839457 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gzp5\" (UniqueName: \"kubernetes.io/projected/5f89ecdd-60c3-4da6-b185-1f044d8ffc46-kube-api-access-5gzp5\") pod \"cinder-operator-controller-manager-55d77d7b5c-f9jr4\" (UID: \"5f89ecdd-60c3-4da6-b185-1f044d8ffc46\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.839492 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45mlh\" (UniqueName: \"kubernetes.io/projected/92716f38-db4c-41d9-962d-f3cc2669a7fb-kube-api-access-45mlh\") pod \"barbican-operator-controller-manager-6db6876945-rplg5\" (UID: \"92716f38-db4c-41d9-962d-f3cc2669a7fb\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 
19:49:49.858999 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.860023 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.865261 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-2cmxp" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.868984 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.870024 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.871499 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-25k8p" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.878912 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.892407 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.900258 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.901842 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.904213 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2jvqx" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.904369 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.921123 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.932574 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.933354 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.935876 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vmjjc" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.938987 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.940149 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn8vs\" (UniqueName: \"kubernetes.io/projected/45c29030-0945-4655-b035-d75e8bf0f818-kube-api-access-jn8vs\") pod \"designate-operator-controller-manager-5d87c9d997-sbrjr\" (UID: \"45c29030-0945-4655-b035-d75e8bf0f818\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:49:49 crc 
kubenswrapper[4885]: I0308 19:49:49.940208 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzp5\" (UniqueName: \"kubernetes.io/projected/5f89ecdd-60c3-4da6-b185-1f044d8ffc46-kube-api-access-5gzp5\") pod \"cinder-operator-controller-manager-55d77d7b5c-f9jr4\" (UID: \"5f89ecdd-60c3-4da6-b185-1f044d8ffc46\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.940261 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45mlh\" (UniqueName: \"kubernetes.io/projected/92716f38-db4c-41d9-962d-f3cc2669a7fb-kube-api-access-45mlh\") pod \"barbican-operator-controller-manager-6db6876945-rplg5\" (UID: \"92716f38-db4c-41d9-962d-f3cc2669a7fb\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.940297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4429\" (UniqueName: \"kubernetes.io/projected/69dc5eb7-1c2e-4fbb-a220-2129df60ffb3-kube-api-access-q4429\") pod \"glance-operator-controller-manager-64db6967f8-4hstb\" (UID: \"69dc5eb7-1c2e-4fbb-a220-2129df60ffb3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.940342 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjqkw\" (UniqueName: \"kubernetes.io/projected/4742ab81-6c6d-43c8-8025-6a656b8c40dc-kube-api-access-sjqkw\") pod \"horizon-operator-controller-manager-78bc7f9bd9-xplpw\" (UID: \"4742ab81-6c6d-43c8-8025-6a656b8c40dc\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.944994 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.946031 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.949382 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fspv5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.955993 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.961291 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.966366 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gzp5\" (UniqueName: \"kubernetes.io/projected/5f89ecdd-60c3-4da6-b185-1f044d8ffc46-kube-api-access-5gzp5\") pod \"cinder-operator-controller-manager-55d77d7b5c-f9jr4\" (UID: \"5f89ecdd-60c3-4da6-b185-1f044d8ffc46\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.971736 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.971773 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.972032 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.973196 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.976779 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kwttv" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.976859 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45mlh\" (UniqueName: \"kubernetes.io/projected/92716f38-db4c-41d9-962d-f3cc2669a7fb-kube-api-access-45mlh\") pod \"barbican-operator-controller-manager-6db6876945-rplg5\" (UID: \"92716f38-db4c-41d9-962d-f3cc2669a7fb\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.977036 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-44779" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.977412 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn8vs\" (UniqueName: \"kubernetes.io/projected/45c29030-0945-4655-b035-d75e8bf0f818-kube-api-access-jn8vs\") pod \"designate-operator-controller-manager-5d87c9d997-sbrjr\" (UID: \"45c29030-0945-4655-b035-d75e8bf0f818\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.981287 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.991381 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.995472 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-c6pxs" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.997223 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.008602 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042514 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042582 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7x9b\" (UniqueName: \"kubernetes.io/projected/d5770638-6059-4ce5-b401-84b0155589a3-kube-api-access-s7x9b\") pod \"heat-operator-controller-manager-cf99c678f-n88vz\" (UID: \"d5770638-6059-4ce5-b401-84b0155589a3\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042609 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8nk\" (UniqueName: \"kubernetes.io/projected/9fc40f07-4706-4008-b86e-e73a2f2ab620-kube-api-access-4k8nk\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: 
\"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042640 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjqkw\" (UniqueName: \"kubernetes.io/projected/4742ab81-6c6d-43c8-8025-6a656b8c40dc-kube-api-access-sjqkw\") pod \"horizon-operator-controller-manager-78bc7f9bd9-xplpw\" (UID: \"4742ab81-6c6d-43c8-8025-6a656b8c40dc\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pl2s\" (UniqueName: \"kubernetes.io/projected/157555d5-ca64-49f8-8849-cd763c83feda-kube-api-access-6pl2s\") pod \"keystone-operator-controller-manager-7c789f89c6-hlpjf\" (UID: \"157555d5-ca64-49f8-8849-cd763c83feda\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042738 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25s88\" (UniqueName: \"kubernetes.io/projected/7180efa7-8d93-436e-8de2-78fe5c173843-kube-api-access-25s88\") pod \"ironic-operator-controller-manager-545456dc4-nclkr\" (UID: \"7180efa7-8d93-436e-8de2-78fe5c173843\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042777 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4429\" (UniqueName: \"kubernetes.io/projected/69dc5eb7-1c2e-4fbb-a220-2129df60ffb3-kube-api-access-q4429\") pod \"glance-operator-controller-manager-64db6967f8-4hstb\" (UID: \"69dc5eb7-1c2e-4fbb-a220-2129df60ffb3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" 
Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.047865 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.060591 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.064979 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.065746 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4429\" (UniqueName: \"kubernetes.io/projected/69dc5eb7-1c2e-4fbb-a220-2129df60ffb3-kube-api-access-q4429\") pod \"glance-operator-controller-manager-64db6967f8-4hstb\" (UID: \"69dc5eb7-1c2e-4fbb-a220-2129df60ffb3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.068069 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-ptvkv" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.070956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjqkw\" (UniqueName: \"kubernetes.io/projected/4742ab81-6c6d-43c8-8025-6a656b8c40dc-kube-api-access-sjqkw\") pod \"horizon-operator-controller-manager-78bc7f9bd9-xplpw\" (UID: \"4742ab81-6c6d-43c8-8025-6a656b8c40dc\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.073554 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.087124 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.087530 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.091041 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.091928 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.093767 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-4w2rv" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.095356 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.096249 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.100113 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.100545 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-hlblr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.105491 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.110131 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.114951 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.116181 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.118287 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.121590 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-28nm4" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.130582 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.131782 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.137248 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xnzlb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.139623 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.144532 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145486 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7x9b\" (UniqueName: \"kubernetes.io/projected/d5770638-6059-4ce5-b401-84b0155589a3-kube-api-access-s7x9b\") pod \"heat-operator-controller-manager-cf99c678f-n88vz\" (UID: \"d5770638-6059-4ce5-b401-84b0155589a3\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145536 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k8nk\" (UniqueName: 
\"kubernetes.io/projected/9fc40f07-4706-4008-b86e-e73a2f2ab620-kube-api-access-4k8nk\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145600 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rlkq\" (UniqueName: \"kubernetes.io/projected/27aa3877-54cd-414d-80a0-ab20a68ed535-kube-api-access-6rlkq\") pod \"manila-operator-controller-manager-67d996989d-q5hfb\" (UID: \"27aa3877-54cd-414d-80a0-ab20a68ed535\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145650 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5stv\" (UniqueName: \"kubernetes.io/projected/8f363429-f2b7-468c-b74b-ef14ebfab90e-kube-api-access-j5stv\") pod \"mariadb-operator-controller-manager-7b6bfb6475-2hsgc\" (UID: \"8f363429-f2b7-468c-b74b-ef14ebfab90e\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145673 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pl2s\" (UniqueName: \"kubernetes.io/projected/157555d5-ca64-49f8-8849-cd763c83feda-kube-api-access-6pl2s\") pod \"keystone-operator-controller-manager-7c789f89c6-hlpjf\" (UID: \"157555d5-ca64-49f8-8849-cd763c83feda\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145696 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25s88\" (UniqueName: \"kubernetes.io/projected/7180efa7-8d93-436e-8de2-78fe5c173843-kube-api-access-25s88\") pod 
\"ironic-operator-controller-manager-545456dc4-nclkr\" (UID: \"7180efa7-8d93-436e-8de2-78fe5c173843\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145751 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l76dz\" (UniqueName: \"kubernetes.io/projected/392750e0-9d71-418d-89b0-ec10f33ec505-kube-api-access-l76dz\") pod \"neutron-operator-controller-manager-54688575f-p8r6f\" (UID: \"392750e0-9d71-418d-89b0-ec10f33ec505\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145807 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.146068 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.146120 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. No retries permitted until 2026-03-08 19:49:50.646105704 +0000 UTC m=+1092.042159727 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.147723 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctx6x\" (UniqueName: \"kubernetes.io/projected/7c05f3ed-fe8f-47db-b596-8b90b96c295c-kube-api-access-ctx6x\") pod \"nova-operator-controller-manager-74b6b5dc96-7vtx7\" (UID: \"7c05f3ed-fe8f-47db-b596-8b90b96c295c\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.176403 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k8nk\" (UniqueName: \"kubernetes.io/projected/9fc40f07-4706-4008-b86e-e73a2f2ab620-kube-api-access-4k8nk\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.177590 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7x9b\" (UniqueName: \"kubernetes.io/projected/d5770638-6059-4ce5-b401-84b0155589a3-kube-api-access-s7x9b\") pod \"heat-operator-controller-manager-cf99c678f-n88vz\" (UID: \"d5770638-6059-4ce5-b401-84b0155589a3\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.179428 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pl2s\" (UniqueName: \"kubernetes.io/projected/157555d5-ca64-49f8-8849-cd763c83feda-kube-api-access-6pl2s\") pod \"keystone-operator-controller-manager-7c789f89c6-hlpjf\" (UID: 
\"157555d5-ca64-49f8-8849-cd763c83feda\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.186512 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25s88\" (UniqueName: \"kubernetes.io/projected/7180efa7-8d93-436e-8de2-78fe5c173843-kube-api-access-25s88\") pod \"ironic-operator-controller-manager-545456dc4-nclkr\" (UID: \"7180efa7-8d93-436e-8de2-78fe5c173843\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.187450 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.192132 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.194363 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.198119 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-brs4j" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.198945 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.231627 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248607 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l76dz\" (UniqueName: \"kubernetes.io/projected/392750e0-9d71-418d-89b0-ec10f33ec505-kube-api-access-l76dz\") pod \"neutron-operator-controller-manager-54688575f-p8r6f\" (UID: \"392750e0-9d71-418d-89b0-ec10f33ec505\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctx6x\" (UniqueName: \"kubernetes.io/projected/7c05f3ed-fe8f-47db-b596-8b90b96c295c-kube-api-access-ctx6x\") pod \"nova-operator-controller-manager-74b6b5dc96-7vtx7\" (UID: \"7c05f3ed-fe8f-47db-b596-8b90b96c295c\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248705 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248737 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7snbb\" (UniqueName: \"kubernetes.io/projected/bbb8966a-e61f-427d-af2a-0fdab2348d03-kube-api-access-7snbb\") pod 
\"octavia-operator-controller-manager-5d86c7ddb7-k4r6w\" (UID: \"bbb8966a-e61f-427d-af2a-0fdab2348d03\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtkpt\" (UniqueName: \"kubernetes.io/projected/8d086566-6154-4ddd-8028-a9c203cfec11-kube-api-access-mtkpt\") pod \"placement-operator-controller-manager-648564c9fc-4gfw2\" (UID: \"8d086566-6154-4ddd-8028-a9c203cfec11\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248782 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnnlf\" (UniqueName: \"kubernetes.io/projected/d8de7df0-2dea-4d3c-a02e-57bfabade82f-kube-api-access-vnnlf\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248807 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rlkq\" (UniqueName: \"kubernetes.io/projected/27aa3877-54cd-414d-80a0-ab20a68ed535-kube-api-access-6rlkq\") pod \"manila-operator-controller-manager-67d996989d-q5hfb\" (UID: \"27aa3877-54cd-414d-80a0-ab20a68ed535\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248828 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt8f7\" (UniqueName: \"kubernetes.io/projected/44fbac8d-d81f-4c03-9555-ef33551d478d-kube-api-access-kt8f7\") pod \"ovn-operator-controller-manager-75684d597f-wdrfh\" (UID: 
\"44fbac8d-d81f-4c03-9555-ef33551d478d\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248845 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5stv\" (UniqueName: \"kubernetes.io/projected/8f363429-f2b7-468c-b74b-ef14ebfab90e-kube-api-access-j5stv\") pod \"mariadb-operator-controller-manager-7b6bfb6475-2hsgc\" (UID: \"8f363429-f2b7-468c-b74b-ef14ebfab90e\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.249384 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.266879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5stv\" (UniqueName: \"kubernetes.io/projected/8f363429-f2b7-468c-b74b-ef14ebfab90e-kube-api-access-j5stv\") pod \"mariadb-operator-controller-manager-7b6bfb6475-2hsgc\" (UID: \"8f363429-f2b7-468c-b74b-ef14ebfab90e\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.267316 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.268181 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.268489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l76dz\" (UniqueName: \"kubernetes.io/projected/392750e0-9d71-418d-89b0-ec10f33ec505-kube-api-access-l76dz\") pod \"neutron-operator-controller-manager-54688575f-p8r6f\" (UID: \"392750e0-9d71-418d-89b0-ec10f33ec505\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.269037 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.270148 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctx6x\" (UniqueName: \"kubernetes.io/projected/7c05f3ed-fe8f-47db-b596-8b90b96c295c-kube-api-access-ctx6x\") pod \"nova-operator-controller-manager-74b6b5dc96-7vtx7\" (UID: \"7c05f3ed-fe8f-47db-b596-8b90b96c295c\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.272482 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rlkq\" (UniqueName: \"kubernetes.io/projected/27aa3877-54cd-414d-80a0-ab20a68ed535-kube-api-access-6rlkq\") pod \"manila-operator-controller-manager-67d996989d-q5hfb\" (UID: \"27aa3877-54cd-414d-80a0-ab20a68ed535\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.276009 4885 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-c2b9s" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.281632 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.339356 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.341154 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.343052 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.345512 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-4cwrh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349734 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk6ft\" (UniqueName: \"kubernetes.io/projected/d9580392-741e-406b-b72d-91aa945f65c2-kube-api-access-tk6ft\") pod \"swift-operator-controller-manager-9b9ff9f4d-7hgld\" (UID: \"d9580392-741e-406b-b72d-91aa945f65c2\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349782 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkgj6\" (UniqueName: \"kubernetes.io/projected/c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b-kube-api-access-vkgj6\") pod \"telemetry-operator-controller-manager-5fdb694969-7mghs\" (UID: \"c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b\") " 
pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349827 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349858 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7snbb\" (UniqueName: \"kubernetes.io/projected/bbb8966a-e61f-427d-af2a-0fdab2348d03-kube-api-access-7snbb\") pod \"octavia-operator-controller-manager-5d86c7ddb7-k4r6w\" (UID: \"bbb8966a-e61f-427d-af2a-0fdab2348d03\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtkpt\" (UniqueName: \"kubernetes.io/projected/8d086566-6154-4ddd-8028-a9c203cfec11-kube-api-access-mtkpt\") pod \"placement-operator-controller-manager-648564c9fc-4gfw2\" (UID: \"8d086566-6154-4ddd-8028-a9c203cfec11\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnnlf\" (UniqueName: \"kubernetes.io/projected/d8de7df0-2dea-4d3c-a02e-57bfabade82f-kube-api-access-vnnlf\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.350030 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt8f7\" (UniqueName: \"kubernetes.io/projected/44fbac8d-d81f-4c03-9555-ef33551d478d-kube-api-access-kt8f7\") pod \"ovn-operator-controller-manager-75684d597f-wdrfh\" (UID: \"44fbac8d-d81f-4c03-9555-ef33551d478d\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.350123 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.350171 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert podName:d8de7df0-2dea-4d3c-a02e-57bfabade82f nodeName:}" failed. No retries permitted until 2026-03-08 19:49:50.850156466 +0000 UTC m=+1092.246210479 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" (UID: "d8de7df0-2dea-4d3c-a02e-57bfabade82f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.356099 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.383462 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtkpt\" (UniqueName: \"kubernetes.io/projected/8d086566-6154-4ddd-8028-a9c203cfec11-kube-api-access-mtkpt\") pod \"placement-operator-controller-manager-648564c9fc-4gfw2\" (UID: \"8d086566-6154-4ddd-8028-a9c203cfec11\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.387142 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.393177 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7snbb\" (UniqueName: \"kubernetes.io/projected/bbb8966a-e61f-427d-af2a-0fdab2348d03-kube-api-access-7snbb\") pod \"octavia-operator-controller-manager-5d86c7ddb7-k4r6w\" (UID: \"bbb8966a-e61f-427d-af2a-0fdab2348d03\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.396010 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt8f7\" (UniqueName: \"kubernetes.io/projected/44fbac8d-d81f-4c03-9555-ef33551d478d-kube-api-access-kt8f7\") pod \"ovn-operator-controller-manager-75684d597f-wdrfh\" (UID: \"44fbac8d-d81f-4c03-9555-ef33551d478d\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.406848 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnnlf\" (UniqueName: \"kubernetes.io/projected/d8de7df0-2dea-4d3c-a02e-57bfabade82f-kube-api-access-vnnlf\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: 
\"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.426838 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.427696 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.439533 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.447970 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5hdn5" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.448211 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.452763 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6ft\" (UniqueName: \"kubernetes.io/projected/d9580392-741e-406b-b72d-91aa945f65c2-kube-api-access-tk6ft\") pod \"swift-operator-controller-manager-9b9ff9f4d-7hgld\" (UID: \"d9580392-741e-406b-b72d-91aa945f65c2\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.454134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgj6\" (UniqueName: \"kubernetes.io/projected/c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b-kube-api-access-vkgj6\") pod \"telemetry-operator-controller-manager-5fdb694969-7mghs\" (UID: \"c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.454293 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrzrw\" (UniqueName: \"kubernetes.io/projected/ea5acc0f-2ad8-46d5-80a2-502e2900fdd6-kube-api-access-mrzrw\") pod \"test-operator-controller-manager-55b5ff4dbb-xf4hm\" (UID: \"ea5acc0f-2ad8-46d5-80a2-502e2900fdd6\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.471553 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.483351 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.488739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk6ft\" (UniqueName: \"kubernetes.io/projected/d9580392-741e-406b-b72d-91aa945f65c2-kube-api-access-tk6ft\") pod \"swift-operator-controller-manager-9b9ff9f4d-7hgld\" (UID: \"d9580392-741e-406b-b72d-91aa945f65c2\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.496077 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgj6\" (UniqueName: \"kubernetes.io/projected/c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b-kube-api-access-vkgj6\") pod \"telemetry-operator-controller-manager-5fdb694969-7mghs\" (UID: \"c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.503636 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.510617 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.511543 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.518192 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.532662 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.533141 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.538772 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qzlfh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.538911 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.539026 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.557411 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkl5q\" (UniqueName: \"kubernetes.io/projected/d5136d34-82a8-47c5-9d7d-09e0206587e8-kube-api-access-mkl5q\") pod \"watcher-operator-controller-manager-bccc79885-66zgf\" (UID: \"d5136d34-82a8-47c5-9d7d-09e0206587e8\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.557574 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrzrw\" (UniqueName: \"kubernetes.io/projected/ea5acc0f-2ad8-46d5-80a2-502e2900fdd6-kube-api-access-mrzrw\") pod \"test-operator-controller-manager-55b5ff4dbb-xf4hm\" (UID: \"ea5acc0f-2ad8-46d5-80a2-502e2900fdd6\") " 
pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.561908 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.563204 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.573302 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-dxr5z" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.583439 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrzrw\" (UniqueName: \"kubernetes.io/projected/ea5acc0f-2ad8-46d5-80a2-502e2900fdd6-kube-api-access-mrzrw\") pod \"test-operator-controller-manager-55b5ff4dbb-xf4hm\" (UID: \"ea5acc0f-2ad8-46d5-80a2-502e2900fdd6\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.586419 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.601860 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.616554 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658488 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkl5q\" (UniqueName: \"kubernetes.io/projected/d5136d34-82a8-47c5-9d7d-09e0206587e8-kube-api-access-mkl5q\") pod \"watcher-operator-controller-manager-bccc79885-66zgf\" (UID: \"d5136d34-82a8-47c5-9d7d-09e0206587e8\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658540 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658598 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658620 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w56j6\" (UniqueName: \"kubernetes.io/projected/deedb14e-007e-44eb-bd52-85bbc12d0bec-kube-api-access-w56j6\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" 
(UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658648 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snn8c\" (UniqueName: \"kubernetes.io/projected/a8caa87f-832f-4436-beaa-aaa505de3bac-kube-api-access-snn8c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pd9b2\" (UID: \"a8caa87f-832f-4436-beaa-aaa505de3bac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658690 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.659082 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.659125 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. No retries permitted until 2026-03-08 19:49:51.659110669 +0000 UTC m=+1093.055164692 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.666577 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.675272 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.682128 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkl5q\" (UniqueName: \"kubernetes.io/projected/d5136d34-82a8-47c5-9d7d-09e0206587e8-kube-api-access-mkl5q\") pod \"watcher-operator-controller-manager-bccc79885-66zgf\" (UID: \"d5136d34-82a8-47c5-9d7d-09e0206587e8\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.759721 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.759838 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " 
pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.761300 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w56j6\" (UniqueName: \"kubernetes.io/projected/deedb14e-007e-44eb-bd52-85bbc12d0bec-kube-api-access-w56j6\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.761357 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snn8c\" (UniqueName: \"kubernetes.io/projected/a8caa87f-832f-4436-beaa-aaa505de3bac-kube-api-access-snn8c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pd9b2\" (UID: \"a8caa87f-832f-4436-beaa-aaa505de3bac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.761941 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.761985 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:51.261971437 +0000 UTC m=+1092.658025460 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "metrics-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.762183 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.762209 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:51.262201273 +0000 UTC m=+1092.658255296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.776982 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.782451 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w56j6\" (UniqueName: \"kubernetes.io/projected/deedb14e-007e-44eb-bd52-85bbc12d0bec-kube-api-access-w56j6\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.790547 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snn8c\" (UniqueName: \"kubernetes.io/projected/a8caa87f-832f-4436-beaa-aaa505de3bac-kube-api-access-snn8c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pd9b2\" (UID: \"a8caa87f-832f-4436-beaa-aaa505de3bac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.806995 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.832004 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" event={"ID":"5f89ecdd-60c3-4da6-b185-1f044d8ffc46","Type":"ContainerStarted","Data":"48b0f86ce995df16a2a3fc037110d4df83435b585f5cda5e3cc460093a361871"} Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.843806 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" event={"ID":"92716f38-db4c-41d9-962d-f3cc2669a7fb","Type":"ContainerStarted","Data":"756492bc66b59da92f7b4c56ebdb96f7b5fa6f316e89c45f1ccfe8d28edcee87"} Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.862521 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.862750 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.862846 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert podName:d8de7df0-2dea-4d3c-a02e-57bfabade82f nodeName:}" failed. No retries permitted until 2026-03-08 19:49:51.862827771 +0000 UTC m=+1093.258881794 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" (UID: "d8de7df0-2dea-4d3c-a02e-57bfabade82f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.892724 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.051513 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.071243 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr"] Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.076479 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7180efa7_8d93_436e_8de2_78fe5c173843.slice/crio-b5e60bda31677dfba1c37e87695e7d278e8bb1997364df8f0293c04a4f8e8464 WatchSource:0}: Error finding container b5e60bda31677dfba1c37e87695e7d278e8bb1997364df8f0293c04a4f8e8464: Status 404 returned error can't find the container with id b5e60bda31677dfba1c37e87695e7d278e8bb1997364df8f0293c04a4f8e8464 Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.094638 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz"] Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.109449 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4742ab81_6c6d_43c8_8025_6a656b8c40dc.slice/crio-c0fe8f1a3bd237f888c8a31e50768006cb9b167663fc9864c11e258af15ea6f0 WatchSource:0}: Error finding container c0fe8f1a3bd237f888c8a31e50768006cb9b167663fc9864c11e258af15ea6f0: Status 404 returned error can't find the container with id c0fe8f1a3bd237f888c8a31e50768006cb9b167663fc9864c11e258af15ea6f0 Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.113317 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw"] Mar 08 19:49:51 crc 
kubenswrapper[4885]: I0308 19:49:51.123838 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.199345 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f"] Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.208771 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c05f3ed_fe8f_47db_b596_8b90b96c295c.slice/crio-5d7f4b3a38d9f8fe5ccceced349c318df86e0d9116ec2742b501047606e209f9 WatchSource:0}: Error finding container 5d7f4b3a38d9f8fe5ccceced349c318df86e0d9116ec2742b501047606e209f9: Status 404 returned error can't find the container with id 5d7f4b3a38d9f8fe5ccceced349c318df86e0d9116ec2742b501047606e209f9 Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.210906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.227343 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.239561 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc"] Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.239963 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27aa3877_54cd_414d_80a0_ab20a68ed535.slice/crio-ec3415b20908b61c4dd355713fa0c046f81a65a76b0c71483605479ade138ddc WatchSource:0}: Error finding container ec3415b20908b61c4dd355713fa0c046f81a65a76b0c71483605479ade138ddc: Status 404 returned error can't find the container with id 
ec3415b20908b61c4dd355713fa0c046f81a65a76b0c71483605479ade138ddc Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.268948 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.269014 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.269253 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.269309 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:52.269291111 +0000 UTC m=+1093.665345144 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.269682 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.269725 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:52.269713962 +0000 UTC m=+1093.665767985 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "metrics-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.380857 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.380892 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.384661 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w"] Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.388874 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mrzrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-xf4hm_openstack-operators(ea5acc0f-2ad8-46d5-80a2-502e2900fdd6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.390039 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" podUID="ea5acc0f-2ad8-46d5-80a2-502e2900fdd6" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.398394 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7snbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-k4r6w_openstack-operators(bbb8966a-e61f-427d-af2a-0fdab2348d03): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.399855 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" podUID="bbb8966a-e61f-427d-af2a-0fdab2348d03" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.400516 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.409219 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.423374 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh"] Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.428863 4885 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mtkpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-4gfw2_openstack-operators(8d086566-6154-4ddd-8028-a9c203cfec11): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.430151 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" podUID="8d086566-6154-4ddd-8028-a9c203cfec11" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.431437 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kt8f7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-75684d597f-wdrfh_openstack-operators(44fbac8d-d81f-4c03-9555-ef33551d478d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.432759 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" podUID="44fbac8d-d81f-4c03-9555-ef33551d478d" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.511902 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2"] Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.519577 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8caa87f_832f_4436_beaa_aaa505de3bac.slice/crio-1c3793df69403e4f28322a43bec14126c61384bb5fca6a1036a199b548db3ae1 WatchSource:0}: Error finding container 1c3793df69403e4f28322a43bec14126c61384bb5fca6a1036a199b548db3ae1: Status 404 returned error can't find the container with id 
1c3793df69403e4f28322a43bec14126c61384bb5fca6a1036a199b548db3ae1 Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.520059 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf"] Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.522911 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-snn8c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pd9b2_openstack-operators(a8caa87f-832f-4436-beaa-aaa505de3bac): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.524854 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" podUID="a8caa87f-832f-4436-beaa-aaa505de3bac" Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.526310 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5136d34_82a8_47c5_9d7d_09e0206587e8.slice/crio-5fc4ee9939196cbb8192e2aefc421bfb5ee60de0012d2d0f07dd61b87edd41f2 WatchSource:0}: Error finding container 5fc4ee9939196cbb8192e2aefc421bfb5ee60de0012d2d0f07dd61b87edd41f2: Status 404 returned error can't find the container with id 5fc4ee9939196cbb8192e2aefc421bfb5ee60de0012d2d0f07dd61b87edd41f2 Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.528728 
4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mkl5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-66zgf_openstack-operators(d5136d34-82a8-47c5-9d7d-09e0206587e8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.530203 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" podUID="d5136d34-82a8-47c5-9d7d-09e0206587e8" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.673729 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.673942 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 
19:49:51.674047 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. No retries permitted until 2026-03-08 19:49:53.674029223 +0000 UTC m=+1095.070083246 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.862457 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" event={"ID":"d5136d34-82a8-47c5-9d7d-09e0206587e8","Type":"ContainerStarted","Data":"5fc4ee9939196cbb8192e2aefc421bfb5ee60de0012d2d0f07dd61b87edd41f2"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.863974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" event={"ID":"7c05f3ed-fe8f-47db-b596-8b90b96c295c","Type":"ContainerStarted","Data":"5d7f4b3a38d9f8fe5ccceced349c318df86e0d9116ec2742b501047606e209f9"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.876903 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.877143 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" podUID="d5136d34-82a8-47c5-9d7d-09e0206587e8" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.878035 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.878218 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert podName:d8de7df0-2dea-4d3c-a02e-57bfabade82f nodeName:}" failed. No retries permitted until 2026-03-08 19:49:53.878201797 +0000 UTC m=+1095.274255820 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" (UID: "d8de7df0-2dea-4d3c-a02e-57bfabade82f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.878601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" event={"ID":"27aa3877-54cd-414d-80a0-ab20a68ed535","Type":"ContainerStarted","Data":"ec3415b20908b61c4dd355713fa0c046f81a65a76b0c71483605479ade138ddc"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.879668 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" event={"ID":"d5770638-6059-4ce5-b401-84b0155589a3","Type":"ContainerStarted","Data":"00f2e860692473c9453bb342db6fb5e86415cef335b39e93b7cd6e8d51ea80ea"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.881836 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" event={"ID":"bbb8966a-e61f-427d-af2a-0fdab2348d03","Type":"ContainerStarted","Data":"7de47ba1e2ce934c6011ab0166bb5689c1ea770d87a8cccf125ef3004a8c9db8"} Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.882892 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" podUID="bbb8966a-e61f-427d-af2a-0fdab2348d03" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.883764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" event={"ID":"a8caa87f-832f-4436-beaa-aaa505de3bac","Type":"ContainerStarted","Data":"1c3793df69403e4f28322a43bec14126c61384bb5fca6a1036a199b548db3ae1"} Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.888207 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" podUID="a8caa87f-832f-4436-beaa-aaa505de3bac" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.890721 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" event={"ID":"8d086566-6154-4ddd-8028-a9c203cfec11","Type":"ContainerStarted","Data":"4fc553a4c42a95d7fcb95abe6357e3c0699d49e15dfe1a6b2c7a44f1ed212a6a"} Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.895675 4885 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" podUID="8d086566-6154-4ddd-8028-a9c203cfec11" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.895685 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" event={"ID":"157555d5-ca64-49f8-8849-cd763c83feda","Type":"ContainerStarted","Data":"55b8e268ae419efc6beb90b074b226ced0641fd94c30416a4f9e9b6518087557"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.897517 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" event={"ID":"4742ab81-6c6d-43c8-8025-6a656b8c40dc","Type":"ContainerStarted","Data":"c0fe8f1a3bd237f888c8a31e50768006cb9b167663fc9864c11e258af15ea6f0"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.918864 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" event={"ID":"ea5acc0f-2ad8-46d5-80a2-502e2900fdd6","Type":"ContainerStarted","Data":"6f39c495421440ea87ad5b93b507d32dd20fb01def89d8d8ae657ed805fb2786"} Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.935139 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" podUID="ea5acc0f-2ad8-46d5-80a2-502e2900fdd6" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.938544 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" event={"ID":"44fbac8d-d81f-4c03-9555-ef33551d478d","Type":"ContainerStarted","Data":"1ad56fbe65ece63433e9568e838f9491891381c7bad7489a040a9232220f6bef"} Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.944334 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" podUID="44fbac8d-d81f-4c03-9555-ef33551d478d" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.946124 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" event={"ID":"45c29030-0945-4655-b035-d75e8bf0f818","Type":"ContainerStarted","Data":"dc28f5b1f2bea8e9a119ba3f574760ac3ab69da97cef38ddebaf640a31c93cbd"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.985307 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" event={"ID":"c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b","Type":"ContainerStarted","Data":"ef7e432e55054de16167cb7df63e6b9c98f51a80335795e4e8c80fdcc02bc95b"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.988652 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" event={"ID":"7180efa7-8d93-436e-8de2-78fe5c173843","Type":"ContainerStarted","Data":"b5e60bda31677dfba1c37e87695e7d278e8bb1997364df8f0293c04a4f8e8464"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.989598 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" 
event={"ID":"392750e0-9d71-418d-89b0-ec10f33ec505","Type":"ContainerStarted","Data":"e42f2ec6f62fe167702edf74845cf83c2fde632481abbcdf07e044947854e598"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.990325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" event={"ID":"d9580392-741e-406b-b72d-91aa945f65c2","Type":"ContainerStarted","Data":"a2a924fea82282c74f5dda0c1eccc6dee4431859ceceb7b009efb2c4df26e91e"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.991029 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" event={"ID":"8f363429-f2b7-468c-b74b-ef14ebfab90e","Type":"ContainerStarted","Data":"d4e9495a6c0d1aa363282f9491bf343890738fe0e57586495f940e24e4a743da"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.992735 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" event={"ID":"69dc5eb7-1c2e-4fbb-a220-2129df60ffb3","Type":"ContainerStarted","Data":"c62e59e3dd7a0d4f390165a5b9d055d4cc6b5e34d4082535e927defcaae047af"} Mar 08 19:49:52 crc kubenswrapper[4885]: I0308 19:49:52.296774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:52 crc kubenswrapper[4885]: I0308 19:49:52.296842 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: 
\"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:52 crc kubenswrapper[4885]: E0308 19:49:52.296993 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 19:49:52 crc kubenswrapper[4885]: E0308 19:49:52.296992 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 19:49:52 crc kubenswrapper[4885]: E0308 19:49:52.297045 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:54.297030286 +0000 UTC m=+1095.693084299 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "webhook-server-cert" not found Mar 08 19:49:52 crc kubenswrapper[4885]: E0308 19:49:52.297058 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:54.297051656 +0000 UTC m=+1095.693105679 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "metrics-server-cert" not found Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026151 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" podUID="44fbac8d-d81f-4c03-9555-ef33551d478d" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026515 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" podUID="8d086566-6154-4ddd-8028-a9c203cfec11" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026559 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" podUID="bbb8966a-e61f-427d-af2a-0fdab2348d03" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026594 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" podUID="a8caa87f-832f-4436-beaa-aaa505de3bac" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026629 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" podUID="d5136d34-82a8-47c5-9d7d-09e0206587e8" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026662 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" podUID="ea5acc0f-2ad8-46d5-80a2-502e2900fdd6" Mar 08 19:49:53 crc kubenswrapper[4885]: I0308 19:49:53.722286 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.722507 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.722726 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. No retries permitted until 2026-03-08 19:49:57.722709263 +0000 UTC m=+1099.118763286 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:53 crc kubenswrapper[4885]: I0308 19:49:53.931590 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.932041 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.932126 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert podName:d8de7df0-2dea-4d3c-a02e-57bfabade82f nodeName:}" failed. No retries permitted until 2026-03-08 19:49:57.932086355 +0000 UTC m=+1099.328140378 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" (UID: "d8de7df0-2dea-4d3c-a02e-57bfabade82f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:54 crc kubenswrapper[4885]: I0308 19:49:54.337983 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:54 crc kubenswrapper[4885]: I0308 19:49:54.338067 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:54 crc kubenswrapper[4885]: E0308 19:49:54.338156 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 19:49:54 crc kubenswrapper[4885]: E0308 19:49:54.338228 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:58.338209305 +0000 UTC m=+1099.734263328 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "metrics-server-cert" not found Mar 08 19:49:54 crc kubenswrapper[4885]: E0308 19:49:54.338222 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 19:49:54 crc kubenswrapper[4885]: E0308 19:49:54.338299 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:58.338281317 +0000 UTC m=+1099.734335340 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "webhook-server-cert" not found Mar 08 19:49:57 crc kubenswrapper[4885]: I0308 19:49:57.799270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:57 crc kubenswrapper[4885]: E0308 19:49:57.800116 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:57 crc kubenswrapper[4885]: E0308 19:49:57.800381 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert 
podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. No retries permitted until 2026-03-08 19:50:05.800355706 +0000 UTC m=+1107.196409729 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: I0308 19:49:58.003214 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.003356 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.003588 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert podName:d8de7df0-2dea-4d3c-a02e-57bfabade82f nodeName:}" failed. No retries permitted until 2026-03-08 19:50:06.003574205 +0000 UTC m=+1107.399628228 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" (UID: "d8de7df0-2dea-4d3c-a02e-57bfabade82f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: I0308 19:49:58.411448 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.411958 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.412046 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:50:06.412027227 +0000 UTC m=+1107.808081250 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "webhook-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: I0308 19:49:58.412971 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.413078 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.413117 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:50:06.413107616 +0000 UTC m=+1107.809161639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "metrics-server-cert" not found Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.146776 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549990-8n4qs"] Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.148772 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.157856 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549990-8n4qs"] Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.184697 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.184954 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.185429 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.341954 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqrg7\" (UniqueName: \"kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7\") pod \"auto-csr-approver-29549990-8n4qs\" (UID: \"d8bff80c-e537-4de5-8a05-85ee81004c30\") " pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.444022 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqrg7\" (UniqueName: \"kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7\") pod \"auto-csr-approver-29549990-8n4qs\" (UID: \"d8bff80c-e537-4de5-8a05-85ee81004c30\") " pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.486636 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqrg7\" (UniqueName: \"kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7\") pod \"auto-csr-approver-29549990-8n4qs\" (UID: \"d8bff80c-e537-4de5-8a05-85ee81004c30\") " 
pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.512753 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:02 crc kubenswrapper[4885]: I0308 19:50:02.818015 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:50:02 crc kubenswrapper[4885]: I0308 19:50:02.818424 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:50:04 crc kubenswrapper[4885]: E0308 19:50:04.770329 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4" Mar 08 19:50:04 crc kubenswrapper[4885]: E0308 19:50:04.770688 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l76dz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-54688575f-p8r6f_openstack-operators(392750e0-9d71-418d-89b0-ec10f33ec505): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:50:04 crc kubenswrapper[4885]: E0308 19:50:04.772488 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" podUID="392750e0-9d71-418d-89b0-ec10f33ec505" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.134283 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" podUID="392750e0-9d71-418d-89b0-ec10f33ec505" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.289813 4885 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.290002 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ctx6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74b6b5dc96-7vtx7_openstack-operators(7c05f3ed-fe8f-47db-b596-8b90b96c295c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.292057 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" podUID="7c05f3ed-fe8f-47db-b596-8b90b96c295c" Mar 08 19:50:05 crc kubenswrapper[4885]: I0308 19:50:05.834682 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.834866 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.834909 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. No retries permitted until 2026-03-08 19:50:21.834896361 +0000 UTC m=+1123.230950384 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.857605 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.858295 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6pl2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-hlpjf_openstack-operators(157555d5-ca64-49f8-8849-cd763c83feda): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.859673 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" podUID="157555d5-ca64-49f8-8849-cd763c83feda" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.037215 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.047091 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.079717 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.141735 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" event={"ID":"69dc5eb7-1c2e-4fbb-a220-2129df60ffb3","Type":"ContainerStarted","Data":"4250c26dd3c57856cdfdfa6f65fcfcfa2196a6c66abbbfef44e518244929cd2b"} Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.142143 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:50:06 crc kubenswrapper[4885]: E0308 19:50:06.143748 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" podUID="157555d5-ca64-49f8-8849-cd763c83feda" Mar 08 19:50:06 crc kubenswrapper[4885]: E0308 19:50:06.147247 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" podUID="7c05f3ed-fe8f-47db-b596-8b90b96c295c" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.168663 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" podStartSLOduration=2.453742179 podStartE2EDuration="17.168634024s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 
19:49:51.114662734 +0000 UTC m=+1092.510716757" lastFinishedPulling="2026-03-08 19:50:05.829554579 +0000 UTC m=+1107.225608602" observedRunningTime="2026-03-08 19:50:06.159739257 +0000 UTC m=+1107.555793280" watchObservedRunningTime="2026-03-08 19:50:06.168634024 +0000 UTC m=+1107.564688047" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.232486 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549990-8n4qs"] Mar 08 19:50:06 crc kubenswrapper[4885]: W0308 19:50:06.285145 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8bff80c_e537_4de5_8a05_85ee81004c30.slice/crio-98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128 WatchSource:0}: Error finding container 98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128: Status 404 returned error can't find the container with id 98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128 Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.442008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.442063 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.446301 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.446346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.463611 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.636355 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7"] Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.922203 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95"] Mar 08 19:50:06 crc kubenswrapper[4885]: W0308 19:50:06.952332 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeedb14e_007e_44eb_bd52_85bbc12d0bec.slice/crio-774b15b33cb8d3fd049dc8dda6d7a0394047309bcc920150a475de194f68ca35 WatchSource:0}: Error finding container 774b15b33cb8d3fd049dc8dda6d7a0394047309bcc920150a475de194f68ca35: Status 404 returned error can't find the container with id 774b15b33cb8d3fd049dc8dda6d7a0394047309bcc920150a475de194f68ca35 Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 
19:50:07.152170 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" event={"ID":"45c29030-0945-4655-b035-d75e8bf0f818","Type":"ContainerStarted","Data":"ca4a046cee7106430c39b1f13340ce02a94034306d8aab1be4b4f4319a59ec0e"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.152321 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.157315 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" event={"ID":"d9580392-741e-406b-b72d-91aa945f65c2","Type":"ContainerStarted","Data":"2412e484cdbe2a57c60390700ee7e9fdcc59955083ced3ab2b57c998c4dfd2c9"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.157439 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.182908 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" event={"ID":"92716f38-db4c-41d9-962d-f3cc2669a7fb","Type":"ContainerStarted","Data":"91c281577b94a62718ec146ee4d85edcbe9f7a1cffc7e836d845780ef133c30f"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.183548 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.196672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" event={"ID":"8f363429-f2b7-468c-b74b-ef14ebfab90e","Type":"ContainerStarted","Data":"301a2cad9852f68f554a8b1c758196640d8e623c6c0f3a301a797432b2d3eaeb"} Mar 08 19:50:07 
crc kubenswrapper[4885]: I0308 19:50:07.196960 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.203154 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" event={"ID":"5f89ecdd-60c3-4da6-b185-1f044d8ffc46","Type":"ContainerStarted","Data":"0905f998224063cca144cd43e3cb12e75cd9da94647b0cf7fa1f916dbdf97ad2"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.203423 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.208159 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" event={"ID":"c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b","Type":"ContainerStarted","Data":"20b30d97bd2bf4f62cc70d27b97d644abd4014bc18bf06f94a1c01d79b05cd86"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.208540 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.210212 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" event={"ID":"deedb14e-007e-44eb-bd52-85bbc12d0bec","Type":"ContainerStarted","Data":"774b15b33cb8d3fd049dc8dda6d7a0394047309bcc920150a475de194f68ca35"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.214053 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" podStartSLOduration=3.265030453 podStartE2EDuration="18.214036779s" podCreationTimestamp="2026-03-08 19:49:49 
+0000 UTC" firstStartedPulling="2026-03-08 19:49:50.882614458 +0000 UTC m=+1092.278668481" lastFinishedPulling="2026-03-08 19:50:05.831620794 +0000 UTC m=+1107.227674807" observedRunningTime="2026-03-08 19:50:07.183790274 +0000 UTC m=+1108.579844297" watchObservedRunningTime="2026-03-08 19:50:07.214036779 +0000 UTC m=+1108.610090802" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.217310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" event={"ID":"d8bff80c-e537-4de5-8a05-85ee81004c30","Type":"ContainerStarted","Data":"98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.218420 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" podStartSLOduration=2.994140502 podStartE2EDuration="18.218410546s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:50.607885625 +0000 UTC m=+1092.003939648" lastFinishedPulling="2026-03-08 19:50:05.832155669 +0000 UTC m=+1107.228209692" observedRunningTime="2026-03-08 19:50:07.21743229 +0000 UTC m=+1108.613486303" watchObservedRunningTime="2026-03-08 19:50:07.218410546 +0000 UTC m=+1108.614464569" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.228149 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" event={"ID":"d8de7df0-2dea-4d3c-a02e-57bfabade82f","Type":"ContainerStarted","Data":"84939956a7b32f2fe9fcbefc882d3c0bb4a64aea0fc0983b6dba16506bb63f3e"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.231395 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" podStartSLOduration=2.770019285 podStartE2EDuration="17.231376691s" podCreationTimestamp="2026-03-08 
19:49:50 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.370036932 +0000 UTC m=+1092.766090955" lastFinishedPulling="2026-03-08 19:50:05.831394298 +0000 UTC m=+1107.227448361" observedRunningTime="2026-03-08 19:50:07.231118914 +0000 UTC m=+1108.627172937" watchObservedRunningTime="2026-03-08 19:50:07.231376691 +0000 UTC m=+1108.627430714" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.234364 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" event={"ID":"27aa3877-54cd-414d-80a0-ab20a68ed535","Type":"ContainerStarted","Data":"3e8829a44679263cce90c1b420fd6e82935b6f4a3feb0a8d1b56c38352b9192f"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.235312 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.238365 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" event={"ID":"4742ab81-6c6d-43c8-8025-6a656b8c40dc","Type":"ContainerStarted","Data":"c8ee5357d25f841988d89f06bddc6c262675e7c53a4eab8ed89bf0fad9d9f489"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.238565 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.241271 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" event={"ID":"7180efa7-8d93-436e-8de2-78fe5c173843","Type":"ContainerStarted","Data":"4c0ae4b4ed1923b8ba336f1eeedd2c1bed8e648c010a94653c97bf8bde677286"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.241299 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.252354 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" podStartSLOduration=3.115566675 podStartE2EDuration="18.252338899s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:50.695345434 +0000 UTC m=+1092.091399457" lastFinishedPulling="2026-03-08 19:50:05.832117658 +0000 UTC m=+1107.228171681" observedRunningTime="2026-03-08 19:50:07.251418624 +0000 UTC m=+1108.647472647" watchObservedRunningTime="2026-03-08 19:50:07.252338899 +0000 UTC m=+1108.648392922" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.253709 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" event={"ID":"d5770638-6059-4ce5-b401-84b0155589a3","Type":"ContainerStarted","Data":"eed205ae70bf3237ab7511b8c025102e57d642a73384a727791f61d42998ce1b"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.254030 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.273472 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" podStartSLOduration=3.692422999 podStartE2EDuration="18.273453771s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.250676914 +0000 UTC m=+1092.646730937" lastFinishedPulling="2026-03-08 19:50:05.831707686 +0000 UTC m=+1107.227761709" observedRunningTime="2026-03-08 19:50:07.273163494 +0000 UTC m=+1108.669217517" watchObservedRunningTime="2026-03-08 19:50:07.273453771 +0000 UTC m=+1108.669507794" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 
19:50:07.309555 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" podStartSLOduration=2.893514292 podStartE2EDuration="17.309539242s" podCreationTimestamp="2026-03-08 19:49:50 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.420267978 +0000 UTC m=+1092.816322001" lastFinishedPulling="2026-03-08 19:50:05.836292928 +0000 UTC m=+1107.232346951" observedRunningTime="2026-03-08 19:50:07.306208362 +0000 UTC m=+1108.702262385" watchObservedRunningTime="2026-03-08 19:50:07.309539242 +0000 UTC m=+1108.705593265" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.328234 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" podStartSLOduration=3.748882433 podStartE2EDuration="18.328217389s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.250185542 +0000 UTC m=+1092.646239565" lastFinishedPulling="2026-03-08 19:50:05.829520498 +0000 UTC m=+1107.225574521" observedRunningTime="2026-03-08 19:50:07.325876176 +0000 UTC m=+1108.721930199" watchObservedRunningTime="2026-03-08 19:50:07.328217389 +0000 UTC m=+1108.724271412" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.340045 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" podStartSLOduration=3.617150966 podStartE2EDuration="18.340030123s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.108358727 +0000 UTC m=+1092.504412740" lastFinishedPulling="2026-03-08 19:50:05.831237874 +0000 UTC m=+1107.227291897" observedRunningTime="2026-03-08 19:50:07.338458301 +0000 UTC m=+1108.734512324" watchObservedRunningTime="2026-03-08 19:50:07.340030123 +0000 UTC m=+1108.736084146" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.353567 
4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" podStartSLOduration=3.622902409 podStartE2EDuration="18.353549873s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.101819173 +0000 UTC m=+1092.497873196" lastFinishedPulling="2026-03-08 19:50:05.832466637 +0000 UTC m=+1107.228520660" observedRunningTime="2026-03-08 19:50:07.35005044 +0000 UTC m=+1108.746104463" watchObservedRunningTime="2026-03-08 19:50:07.353549873 +0000 UTC m=+1108.749603886" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.391058 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" podStartSLOduration=3.673538278 podStartE2EDuration="18.391037771s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.114144211 +0000 UTC m=+1092.510198234" lastFinishedPulling="2026-03-08 19:50:05.831643704 +0000 UTC m=+1107.227697727" observedRunningTime="2026-03-08 19:50:07.37034501 +0000 UTC m=+1108.766399033" watchObservedRunningTime="2026-03-08 19:50:07.391037771 +0000 UTC m=+1108.787091794" Mar 08 19:50:11 crc kubenswrapper[4885]: I0308 19:50:11.288773 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" event={"ID":"deedb14e-007e-44eb-bd52-85bbc12d0bec","Type":"ContainerStarted","Data":"1de6408aa86d1456ffb2743f6af0c1c9552a4390f1967d3b4009755f799b5445"} Mar 08 19:50:11 crc kubenswrapper[4885]: I0308 19:50:11.289179 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:11 crc kubenswrapper[4885]: I0308 19:50:11.329093 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" podStartSLOduration=21.329070968 podStartE2EDuration="21.329070968s" podCreationTimestamp="2026-03-08 19:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:50:11.327210719 +0000 UTC m=+1112.723264772" watchObservedRunningTime="2026-03-08 19:50:11.329070968 +0000 UTC m=+1112.725125021" Mar 08 19:50:12 crc kubenswrapper[4885]: I0308 19:50:12.298732 4885 generic.go:334] "Generic (PLEG): container finished" podID="d8bff80c-e537-4de5-8a05-85ee81004c30" containerID="626657923ce6ed6491828eb9e3d29e03cb9ceee45223fbdf56fc2006030e8b1d" exitCode=0 Mar 08 19:50:12 crc kubenswrapper[4885]: I0308 19:50:12.298788 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" event={"ID":"d8bff80c-e537-4de5-8a05-85ee81004c30","Type":"ContainerDied","Data":"626657923ce6ed6491828eb9e3d29e03cb9ceee45223fbdf56fc2006030e8b1d"} Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.200444 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.277024 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqrg7\" (UniqueName: \"kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7\") pod \"d8bff80c-e537-4de5-8a05-85ee81004c30\" (UID: \"d8bff80c-e537-4de5-8a05-85ee81004c30\") " Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.292810 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7" (OuterVolumeSpecName: "kube-api-access-mqrg7") pod "d8bff80c-e537-4de5-8a05-85ee81004c30" (UID: "d8bff80c-e537-4de5-8a05-85ee81004c30"). 
InnerVolumeSpecName "kube-api-access-mqrg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.317263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" event={"ID":"d8bff80c-e537-4de5-8a05-85ee81004c30","Type":"ContainerDied","Data":"98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128"} Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.317299 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128" Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.317312 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.378407 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqrg7\" (UniqueName: \"kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7\") on node \"crc\" DevicePath \"\"" Mar 08 19:50:15 crc kubenswrapper[4885]: I0308 19:50:15.257143 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549984-4fjvc"] Mar 08 19:50:15 crc kubenswrapper[4885]: I0308 19:50:15.263505 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549984-4fjvc"] Mar 08 19:50:15 crc kubenswrapper[4885]: I0308 19:50:15.380471 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383ac947-e1b1-4f15-98a6-69fcc60e0ac1" path="/var/lib/kubelet/pods/383ac947-e1b1-4f15-98a6-69fcc60e0ac1/volumes" Mar 08 19:50:16 crc kubenswrapper[4885]: I0308 19:50:16.472581 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 
19:50:17.344197 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" event={"ID":"8d086566-6154-4ddd-8028-a9c203cfec11","Type":"ContainerStarted","Data":"f591fa6f264923752890fa4ae44758ccfaf811d915bddeca71453b99c70a74b1"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.344368 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.345764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" event={"ID":"d5136d34-82a8-47c5-9d7d-09e0206587e8","Type":"ContainerStarted","Data":"66f62958fa94c95d1652a06955146a918f1d708b09f0e339559030c8efdaa0d2"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.345982 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.347216 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" event={"ID":"d8de7df0-2dea-4d3c-a02e-57bfabade82f","Type":"ContainerStarted","Data":"ca291ac12d55f85066620edd4ef44f4ffcf918f74c5aec4f42789ab398b6b7ae"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.347360 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.348661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" 
event={"ID":"bbb8966a-e61f-427d-af2a-0fdab2348d03","Type":"ContainerStarted","Data":"e4e20c6f22b6ee9b2d15293ce56766064b5dff883644ad78be143abdc41d3f8a"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.348850 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.350064 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" event={"ID":"ea5acc0f-2ad8-46d5-80a2-502e2900fdd6","Type":"ContainerStarted","Data":"c747dd03cf5be763fba30f34527af7a9aac5b8d21b4c9a2eea79a607c54c209d"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.350248 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.351710 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" event={"ID":"44fbac8d-d81f-4c03-9555-ef33551d478d","Type":"ContainerStarted","Data":"ff26c7b17bbe62f33310071f4c7349db2b732469daac3772bd36334e8b626c7c"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.351867 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.353084 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" event={"ID":"a8caa87f-832f-4436-beaa-aaa505de3bac","Type":"ContainerStarted","Data":"670157d30b97301a66a6648804293834912a5b9ecaa797a3887928d8d7c13573"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.364874 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" podStartSLOduration=3.619793026 podStartE2EDuration="28.364854692s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.42819655 +0000 UTC m=+1092.824250573" lastFinishedPulling="2026-03-08 19:50:16.173258216 +0000 UTC m=+1117.569312239" observedRunningTime="2026-03-08 19:50:17.364251847 +0000 UTC m=+1118.760305880" watchObservedRunningTime="2026-03-08 19:50:17.364854692 +0000 UTC m=+1118.760908725" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.398878 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" podStartSLOduration=2.80175534 podStartE2EDuration="27.398856418s" podCreationTimestamp="2026-03-08 19:49:50 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.522723436 +0000 UTC m=+1092.918777459" lastFinishedPulling="2026-03-08 19:50:16.119824514 +0000 UTC m=+1117.515878537" observedRunningTime="2026-03-08 19:50:17.386324274 +0000 UTC m=+1118.782378327" watchObservedRunningTime="2026-03-08 19:50:17.398856418 +0000 UTC m=+1118.794910441" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.428268 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" podStartSLOduration=19.510534828 podStartE2EDuration="28.42824979s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:50:06.692051736 +0000 UTC m=+1108.088105759" lastFinishedPulling="2026-03-08 19:50:15.609766698 +0000 UTC m=+1117.005820721" observedRunningTime="2026-03-08 19:50:17.421618774 +0000 UTC m=+1118.817672797" watchObservedRunningTime="2026-03-08 19:50:17.42824979 +0000 UTC m=+1118.824303833" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.445349 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" podStartSLOduration=3.759896765 podStartE2EDuration="28.445329175s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.43122995 +0000 UTC m=+1092.827283973" lastFinishedPulling="2026-03-08 19:50:16.11666235 +0000 UTC m=+1117.512716383" observedRunningTime="2026-03-08 19:50:17.440380663 +0000 UTC m=+1118.836434686" watchObservedRunningTime="2026-03-08 19:50:17.445329175 +0000 UTC m=+1118.841383198" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.452766 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" podStartSLOduration=2.853608609 podStartE2EDuration="27.452751892s" podCreationTimestamp="2026-03-08 19:49:50 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.528632953 +0000 UTC m=+1092.924686976" lastFinishedPulling="2026-03-08 19:50:16.127776236 +0000 UTC m=+1117.523830259" observedRunningTime="2026-03-08 19:50:17.451651223 +0000 UTC m=+1118.847705246" watchObservedRunningTime="2026-03-08 19:50:17.452751892 +0000 UTC m=+1118.848805915" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.476806 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" podStartSLOduration=2.7148285960000003 podStartE2EDuration="27.476792582s" podCreationTimestamp="2026-03-08 19:49:50 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.388723269 +0000 UTC m=+1092.784777292" lastFinishedPulling="2026-03-08 19:50:16.150687245 +0000 UTC m=+1117.546741278" observedRunningTime="2026-03-08 19:50:17.471752738 +0000 UTC m=+1118.867806751" watchObservedRunningTime="2026-03-08 19:50:17.476792582 +0000 UTC m=+1118.872846605" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.488598 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" podStartSLOduration=3.754988474 podStartE2EDuration="28.488582516s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.398267693 +0000 UTC m=+1092.794321706" lastFinishedPulling="2026-03-08 19:50:16.131861715 +0000 UTC m=+1117.527915748" observedRunningTime="2026-03-08 19:50:17.487378594 +0000 UTC m=+1118.883432617" watchObservedRunningTime="2026-03-08 19:50:17.488582516 +0000 UTC m=+1118.884636539" Mar 08 19:50:18 crc kubenswrapper[4885]: I0308 19:50:18.363814 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" event={"ID":"392750e0-9d71-418d-89b0-ec10f33ec505","Type":"ContainerStarted","Data":"17bfd1b3d6efaebabb3eb82c3390d6a76923e13ad648aa637d5eb220eb4f8ad5"} Mar 08 19:50:18 crc kubenswrapper[4885]: I0308 19:50:18.364239 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:50:18 crc kubenswrapper[4885]: I0308 19:50:18.365605 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" event={"ID":"157555d5-ca64-49f8-8849-cd763c83feda","Type":"ContainerStarted","Data":"3d054c295deed751d93f3a8878dc5591a1d6cc8c70fe4d9f2643ee5936e02417"} Mar 08 19:50:18 crc kubenswrapper[4885]: I0308 19:50:18.411705 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" podStartSLOduration=2.793261297 podStartE2EDuration="29.411677496s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.209971951 +0000 UTC m=+1092.606025974" lastFinishedPulling="2026-03-08 19:50:17.82838815 +0000 UTC m=+1119.224442173" observedRunningTime="2026-03-08 19:50:18.407133945 +0000 UTC 
m=+1119.803187968" watchObservedRunningTime="2026-03-08 19:50:18.411677496 +0000 UTC m=+1119.807731539" Mar 08 19:50:18 crc kubenswrapper[4885]: I0308 19:50:18.435674 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" podStartSLOduration=2.708377757 podStartE2EDuration="29.435655974s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.050946969 +0000 UTC m=+1092.447000992" lastFinishedPulling="2026-03-08 19:50:17.778225186 +0000 UTC m=+1119.174279209" observedRunningTime="2026-03-08 19:50:18.427809015 +0000 UTC m=+1119.823863038" watchObservedRunningTime="2026-03-08 19:50:18.435655974 +0000 UTC m=+1119.831710017" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.053250 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.079023 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.094033 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.131482 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.190272 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.206436 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.251959 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.267942 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.343583 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.358884 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.535916 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.605290 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:50:21 crc kubenswrapper[4885]: I0308 19:50:21.389648 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" event={"ID":"7c05f3ed-fe8f-47db-b596-8b90b96c295c","Type":"ContainerStarted","Data":"5c6ba672b42f96c4f3d5a647278db02dfc5a68988906e88606385ec526358e27"} Mar 08 19:50:21 crc kubenswrapper[4885]: I0308 19:50:21.390764 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:50:21 crc kubenswrapper[4885]: I0308 19:50:21.423819 4885 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" podStartSLOduration=2.831358272 podStartE2EDuration="32.423790219s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.210916427 +0000 UTC m=+1092.606970460" lastFinishedPulling="2026-03-08 19:50:20.803348374 +0000 UTC m=+1122.199402407" observedRunningTime="2026-03-08 19:50:21.417926593 +0000 UTC m=+1122.813980646" watchObservedRunningTime="2026-03-08 19:50:21.423790219 +0000 UTC m=+1122.819844282" Mar 08 19:50:21 crc kubenswrapper[4885]: I0308 19:50:21.896197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:21 crc kubenswrapper[4885]: I0308 19:50:21.910288 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:22 crc kubenswrapper[4885]: I0308 19:50:22.025502 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:22 crc kubenswrapper[4885]: I0308 19:50:22.323740 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d"] Mar 08 19:50:22 crc kubenswrapper[4885]: W0308 19:50:22.328446 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc40f07_4706_4008_b86e_e73a2f2ab620.slice/crio-0cffcd70b6546c7dd43e5f1462d417d3ef5559069b1d2f0f03594331922d891d WatchSource:0}: Error finding container 0cffcd70b6546c7dd43e5f1462d417d3ef5559069b1d2f0f03594331922d891d: Status 404 returned error can't find the container with id 0cffcd70b6546c7dd43e5f1462d417d3ef5559069b1d2f0f03594331922d891d Mar 08 19:50:22 crc kubenswrapper[4885]: I0308 19:50:22.398563 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" event={"ID":"9fc40f07-4706-4008-b86e-e73a2f2ab620","Type":"ContainerStarted","Data":"0cffcd70b6546c7dd43e5f1462d417d3ef5559069b1d2f0f03594331922d891d"} Mar 08 19:50:25 crc kubenswrapper[4885]: I0308 19:50:25.425903 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" event={"ID":"9fc40f07-4706-4008-b86e-e73a2f2ab620","Type":"ContainerStarted","Data":"0951ee5f7648c1b7f86e13f2f59910707cb8c3155eca01ccbb648fa4dec6b6c0"} Mar 08 19:50:25 crc kubenswrapper[4885]: I0308 19:50:25.426629 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:25 crc kubenswrapper[4885]: I0308 19:50:25.452725 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" podStartSLOduration=33.774405279 
podStartE2EDuration="36.452704836s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:50:22.331459139 +0000 UTC m=+1123.727513172" lastFinishedPulling="2026-03-08 19:50:25.009758706 +0000 UTC m=+1126.405812729" observedRunningTime="2026-03-08 19:50:25.450713183 +0000 UTC m=+1126.846767236" watchObservedRunningTime="2026-03-08 19:50:25.452704836 +0000 UTC m=+1126.848758859" Mar 08 19:50:26 crc kubenswrapper[4885]: I0308 19:50:26.089088 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.270695 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.443969 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.451556 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.475996 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.520747 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.525261 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.679162 4885 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.781001 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:50:32 crc kubenswrapper[4885]: I0308 19:50:32.035340 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:32 crc kubenswrapper[4885]: I0308 19:50:32.818443 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:50:32 crc kubenswrapper[4885]: I0308 19:50:32.818826 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:50:32 crc kubenswrapper[4885]: I0308 19:50:32.818918 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:50:32 crc kubenswrapper[4885]: I0308 19:50:32.819785 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 19:50:32 crc 
kubenswrapper[4885]: I0308 19:50:32.819873 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374" gracePeriod=600 Mar 08 19:50:33 crc kubenswrapper[4885]: I0308 19:50:33.530565 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374" exitCode=0 Mar 08 19:50:33 crc kubenswrapper[4885]: I0308 19:50:33.530651 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374"} Mar 08 19:50:33 crc kubenswrapper[4885]: I0308 19:50:33.531905 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83"} Mar 08 19:50:33 crc kubenswrapper[4885]: I0308 19:50:33.532088 4885 scope.go:117] "RemoveContainer" containerID="f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2" Mar 08 19:50:46 crc kubenswrapper[4885]: I0308 19:50:46.063504 4885 scope.go:117] "RemoveContainer" containerID="b7734316fc145363037328cab9f126d7c1da55c60bd2f7c56f716841be40429c" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.394734 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:50:49 crc kubenswrapper[4885]: E0308 19:50:49.395894 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bff80c-e537-4de5-8a05-85ee81004c30" 
containerName="oc" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.395911 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bff80c-e537-4de5-8a05-85ee81004c30" containerName="oc" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.396110 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bff80c-e537-4de5-8a05-85ee81004c30" containerName="oc" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.397102 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.400813 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.400893 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.401450 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-cjg8w" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.419102 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.433435 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.496035 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.498606 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.500241 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.508423 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.531566 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg25w\" (UniqueName: \"kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.531648 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.632535 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg25w\" (UniqueName: \"kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.632673 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc 
kubenswrapper[4885]: I0308 19:50:49.632747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.632786 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfmpl\" (UniqueName: \"kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.632830 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.633823 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.651637 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg25w\" (UniqueName: \"kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 
19:50:49.734196 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfmpl\" (UniqueName: \"kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.734247 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.734347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.735276 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.735289 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.735989 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.752615 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfmpl\" (UniqueName: \"kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.818468 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:50 crc kubenswrapper[4885]: W0308 19:50:50.170025 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19de3581_aa8c_48c0_aad6_4139d132ca70.slice/crio-883cc4093e6e1d1ce406c4937cdb0137ce8cbfb3722b5404c71c307ac4d03766 WatchSource:0}: Error finding container 883cc4093e6e1d1ce406c4937cdb0137ce8cbfb3722b5404c71c307ac4d03766: Status 404 returned error can't find the container with id 883cc4093e6e1d1ce406c4937cdb0137ce8cbfb3722b5404c71c307ac4d03766 Mar 08 19:50:50 crc kubenswrapper[4885]: I0308 19:50:50.171235 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:50:50 crc kubenswrapper[4885]: I0308 19:50:50.237814 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:50:50 crc kubenswrapper[4885]: I0308 19:50:50.692032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" event={"ID":"19de3581-aa8c-48c0-aad6-4139d132ca70","Type":"ContainerStarted","Data":"883cc4093e6e1d1ce406c4937cdb0137ce8cbfb3722b5404c71c307ac4d03766"} Mar 08 19:50:50 crc kubenswrapper[4885]: I0308 19:50:50.693260 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" 
event={"ID":"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0","Type":"ContainerStarted","Data":"b81b2d09cf554bd40279185ed607a08f534a46ffc9af81dd3b055b475415182a"} Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.473067 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.498029 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"] Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.499290 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.508858 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"] Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.661774 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.661858 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.662005 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mznsj\" (UniqueName: \"kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " 
pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.763194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.763556 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.763638 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mznsj\" (UniqueName: \"kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.764453 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.764990 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.793703 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mznsj\" (UniqueName: \"kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.822354 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.089601 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.306988 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.334628 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.335971 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.341904 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.472975 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.473310 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nskr\" (UniqueName: \"kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.473411 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.574967 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.575054 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.575081 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nskr\" (UniqueName: \"kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.575959 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.576027 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.612833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nskr\" (UniqueName: \"kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.638800 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.639841 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646445 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646516 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646650 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646727 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8v7l4" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646746 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646807 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.651476 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.665343 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.666164 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.752661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" event={"ID":"604f26ec-2884-4eb6-97f9-e2961f8907b1","Type":"ContainerStarted","Data":"abe3994c827c15ee527cc92242a1479dbff3de65c1e91c72ef3fd14c10326728"}
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781149 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781172 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjj2h\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781209 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781223 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781238 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781265 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781295 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781313 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781347 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781372 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.882801 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883081 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883104 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjj2h\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883135 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883152 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883206 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883232 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883246 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883275 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883298 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883701 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.884335 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.884452 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.884804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.885023 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.885967 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.889620 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.889807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.898177 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.898716 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.905355 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjj2h\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.931067 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0"
Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.979172 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"]
Mar 08 19:50:52 crc kubenswrapper[4885]: W0308 19:50:52.996726 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2196a973_cc42_4633_8c99_1422d07d475a.slice/crio-ea36b64eee9bd69dcd99d1003ff2856f54e1225bf3a3740ad6059b5726729f56 WatchSource:0}: Error finding container ea36b64eee9bd69dcd99d1003ff2856f54e1225bf3a3740ad6059b5726729f56: Status 404 returned error can't find the container with id ea36b64eee9bd69dcd99d1003ff2856f54e1225bf3a3740ad6059b5726729f56
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.005300 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.471598 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.472963 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475696 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475741 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475762 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475701 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7bqmj"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475737 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475853 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.482126 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.494654 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.504756 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 08 19:50:53 crc kubenswrapper[4885]: W0308 19:50:53.513036 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01dc1fd5_4e2f_4129_9452_ed50fa1d182b.slice/crio-8b6317b734453e8868ed75e3450ff9740fcef0ac699a30e38e0adad2d77d26bc WatchSource:0}: Error finding container 8b6317b734453e8868ed75e3450ff9740fcef0ac699a30e38e0adad2d77d26bc: Status 404 returned error can't find the container with id 8b6317b734453e8868ed75e3450ff9740fcef0ac699a30e38e0adad2d77d26bc
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595164 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595236 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595280 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9bwn\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595324 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595340 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595357 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595492 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595612 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595635 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.697795 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.697864 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.697944 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.697966 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698060 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698084 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698099 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9bwn\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698154 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698171 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698592 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.702088 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.702213 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.703162 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.703717 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.715952 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.717614 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.718563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.722833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.726826 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.731452 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9bwn\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.751429 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.772804 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" event={"ID":"2196a973-cc42-4633-8c99-1422d07d475a","Type":"ContainerStarted","Data":"ea36b64eee9bd69dcd99d1003ff2856f54e1225bf3a3740ad6059b5726729f56"}
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.778793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerStarted","Data":"8b6317b734453e8868ed75e3450ff9740fcef0ac699a30e38e0adad2d77d26bc"}
Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.804951 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.380934 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.791722 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerStarted","Data":"654fe72412f8a73fefea3c7f4b820f3cc3985166e76d795cc9ca36d6cf741354"}
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.801874 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.803455 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.807079 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.807237 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.807460 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.807568 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-f5npv"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.810170 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.811383 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916562 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916667 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916692 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916711 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916737 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916829 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916850 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc6jf\" (UniqueName: \"kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018288 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018345 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018367 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018395 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018412 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018434 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc6jf\" (UniqueName: \"kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018472 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018687 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.019132 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.019535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.019776 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.020450 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.025171 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.025375 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0"
Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.035926 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc6jf\" (UniqueName:
\"kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.038400 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.129428 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.688520 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 19:50:55 crc kubenswrapper[4885]: W0308 19:50:55.711125 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93f52f98_0e26_4fc1_a9af_f580531f8550.slice/crio-c562a73365a8a4cec4d84c9a8ed8cc0d8747c679cc7180e594fd7812393beab5 WatchSource:0}: Error finding container c562a73365a8a4cec4d84c9a8ed8cc0d8747c679cc7180e594fd7812393beab5: Status 404 returned error can't find the container with id c562a73365a8a4cec4d84c9a8ed8cc0d8747c679cc7180e594fd7812393beab5 Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.800575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerStarted","Data":"c562a73365a8a4cec4d84c9a8ed8cc0d8747c679cc7180e594fd7812393beab5"} Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.158909 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.160762 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.164060 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.164668 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5kqpk" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.164842 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.164975 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.168687 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.252811 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.252863 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jqgd\" (UniqueName: \"kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.252902 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.252946 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.253000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.253022 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.253058 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.253078 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354187 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354224 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354248 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354285 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354309 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354348 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354368 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jqgd\" (UniqueName: \"kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354614 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.355233 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.355797 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.356008 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.356523 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.366474 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.366488 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle\") pod 
\"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.370934 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jqgd\" (UniqueName: \"kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.398577 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.459762 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.460601 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.465220 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.465362 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.465587 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-cc744" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.474403 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.481184 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.558234 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.558272 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.558371 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.558418 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q554n\" (UniqueName: \"kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.558441 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 
19:50:56.659552 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.659639 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.659662 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.659706 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.659739 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q554n\" (UniqueName: \"kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.662049 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config\") pod 
\"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.662447 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.667907 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.668331 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.674004 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q554n\" (UniqueName: \"kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.832249 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.640872 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.641739 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.643320 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-dsb9r" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.656597 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.791499 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxdn7\" (UniqueName: \"kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7\") pod \"kube-state-metrics-0\" (UID: \"50f2f07f-efc4-4778-944c-d4819f0b0e30\") " pod="openstack/kube-state-metrics-0" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.892745 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxdn7\" (UniqueName: \"kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7\") pod \"kube-state-metrics-0\" (UID: \"50f2f07f-efc4-4778-944c-d4819f0b0e30\") " pod="openstack/kube-state-metrics-0" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.917801 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxdn7\" (UniqueName: \"kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7\") pod \"kube-state-metrics-0\" (UID: \"50f2f07f-efc4-4778-944c-d4819f0b0e30\") " pod="openstack/kube-state-metrics-0" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.991650 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.055700 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.057412 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.063255 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.063458 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.063555 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.063691 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.063800 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-swjjk" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.071423 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144522 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144646 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144678 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144773 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144856 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chcvg\" (UniqueName: \"kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144960 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.145022 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.145123 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246525 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246566 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chcvg\" (UniqueName: \"kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246599 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246619 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246658 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246678 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246708 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246724 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246897 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.247393 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.248052 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.250541 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.252976 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.253062 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.266504 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc 
kubenswrapper[4885]: I0308 19:51:02.269044 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chcvg\" (UniqueName: \"kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.269905 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.395104 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.972966 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mn4lz"] Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.974246 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.978231 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.982470 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.989260 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-47w7f" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.989782 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz"] Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071220 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071283 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071370 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071485 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071553 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz9cs\" (UniqueName: \"kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071605 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071634 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.096402 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.100837 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.105880 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.172875 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz9cs\" (UniqueName: \"kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.172954 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.172984 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173045 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173069 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173102 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173142 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173197 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib\") pod \"ovn-controller-ovs-pp4rs\" (UID: 
\"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173227 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rvx\" (UniqueName: \"kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173271 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173306 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173906 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173993 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc 
kubenswrapper[4885]: I0308 19:51:03.174168 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.175994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.182200 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.194651 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.195833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz9cs\" (UniqueName: \"kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274690 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274753 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274775 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274798 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274821 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rvx\" (UniqueName: \"kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx\") pod \"ovn-controller-ovs-pp4rs\" (UID: 
\"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274877 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.275088 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.275102 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.275189 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.276521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.289655 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-66rvx\" (UniqueName: \"kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.362365 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.417534 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.545030 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.546554 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.570729 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.571168 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5wc4v" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.572189 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.577572 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.583443 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.617393 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.617472 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.618853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk872\" (UniqueName: \"kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.619010 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.619105 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.619173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts\") 
pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.619356 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.619472 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.720930 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.720997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721034 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: 
I0308 19:51:05.721075 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721095 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721585 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721764 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721298 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk872\" (UniqueName: \"kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721872 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.726588 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.726869 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.729041 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.730469 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 
19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.739573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.744299 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.747570 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk872\" (UniqueName: \"kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.887493 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.519395 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.520248 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mznsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessP
robe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-79f9fc56ff-vwdgd_openstack(604f26ec-2884-4eb6-97f9-e2961f8907b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.521615 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.523350 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.523598 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5nskr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7c47bcb9f9-86zq7_openstack(2196a973-cc42-4633-8c99-1422d07d475a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.525876 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" podUID="2196a973-cc42-4633-8c99-1422d07d475a" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.567383 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.567567 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jfmpl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-442qz_openstack(2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.568946 4885 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" podUID="2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.626788 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.627146 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xg25w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-w67fx_openstack(19de3581-aa8c-48c0-aad6-4139d132ca70): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.628363 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-589db6c89c-w67fx" podUID="19de3581-aa8c-48c0-aad6-4139d132ca70" Mar 08 19:51:12 crc kubenswrapper[4885]: I0308 19:51:12.878640 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 19:51:12 crc kubenswrapper[4885]: W0308 19:51:12.888328 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda1d62ba_4033_4906_87c1_d673c1ab8637.slice/crio-d5c7811542e264146da5dbb28e9ad294c9b3d2c5ddb970c427caa3521f6cd065 WatchSource:0}: Error finding container d5c7811542e264146da5dbb28e9ad294c9b3d2c5ddb970c427caa3521f6cd065: Status 404 returned error can't find the container with id d5c7811542e264146da5dbb28e9ad294c9b3d2c5ddb970c427caa3521f6cd065 Mar 08 19:51:12 crc kubenswrapper[4885]: I0308 19:51:12.946836 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"da1d62ba-4033-4906-87c1-d673c1ab8637","Type":"ContainerStarted","Data":"d5c7811542e264146da5dbb28e9ad294c9b3d2c5ddb970c427caa3521f6cd065"} Mar 08 19:51:12 crc kubenswrapper[4885]: I0308 19:51:12.949199 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerStarted","Data":"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2"} Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.950957 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" podUID="2196a973-cc42-4633-8c99-1422d07d475a" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.951339 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" Mar 08 19:51:13 crc kubenswrapper[4885]: W0308 19:51:13.041443 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50f2f07f_efc4_4778_944c_d4819f0b0e30.slice/crio-3c7c7d0cb67bdd716e27a921f55c82cceb775a12654fa8ab0fa3866727e30e29 WatchSource:0}: Error finding container 3c7c7d0cb67bdd716e27a921f55c82cceb775a12654fa8ab0fa3866727e30e29: Status 404 returned error can't find the container with id 3c7c7d0cb67bdd716e27a921f55c82cceb775a12654fa8ab0fa3866727e30e29 Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.044030 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.085557 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz"] Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.180536 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.215206 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.267694 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:51:13 crc kubenswrapper[4885]: W0308 19:51:13.268872 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd768ed9e_b089_4308_befc_e3bd6aa68683.slice/crio-d04c91ea8c65ff23403733626ccc4e0944e79d09733c79a70fe91cced28380f8 WatchSource:0}: Error finding container 
d04c91ea8c65ff23403733626ccc4e0944e79d09733c79a70fe91cced28380f8: Status 404 returned error can't find the container with id d04c91ea8c65ff23403733626ccc4e0944e79d09733c79a70fe91cced28380f8 Mar 08 19:51:13 crc kubenswrapper[4885]: W0308 19:51:13.365622 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod925797ff_e1b0_4df7_83db_2091264a4bb8.slice/crio-9491886528ee59a2997b30d600c2b1b7132f56a89a9aaa3a02c1e5325fbb4651 WatchSource:0}: Error finding container 9491886528ee59a2997b30d600c2b1b7132f56a89a9aaa3a02c1e5325fbb4651: Status 404 returned error can't find the container with id 9491886528ee59a2997b30d600c2b1b7132f56a89a9aaa3a02c1e5325fbb4651 Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.652758 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.662843 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.776142 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg25w\" (UniqueName: \"kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w\") pod \"19de3581-aa8c-48c0-aad6-4139d132ca70\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.776539 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config\") pod \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.776631 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfmpl\" (UniqueName: \"kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl\") pod \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.776657 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config\") pod \"19de3581-aa8c-48c0-aad6-4139d132ca70\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.776731 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc\") pod \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.777751 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config" (OuterVolumeSpecName: "config") pod "19de3581-aa8c-48c0-aad6-4139d132ca70" (UID: "19de3581-aa8c-48c0-aad6-4139d132ca70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.778147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config" (OuterVolumeSpecName: "config") pod "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0" (UID: "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.778283 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0" (UID: "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.779909 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w" (OuterVolumeSpecName: "kube-api-access-xg25w") pod "19de3581-aa8c-48c0-aad6-4139d132ca70" (UID: "19de3581-aa8c-48c0-aad6-4139d132ca70"). InnerVolumeSpecName "kube-api-access-xg25w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.785205 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl" (OuterVolumeSpecName: "kube-api-access-jfmpl") pod "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0" (UID: "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0"). InnerVolumeSpecName "kube-api-access-jfmpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.878180 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg25w\" (UniqueName: \"kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.878209 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.878220 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfmpl\" (UniqueName: \"kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.878229 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.878239 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.954064 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.960272 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerStarted","Data":"f7d40d12aee399534fa9d02af86ea25978b99ea1398acccdac988f16615d42dd"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.962874 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.962885 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" event={"ID":"19de3581-aa8c-48c0-aad6-4139d132ca70","Type":"ContainerDied","Data":"883cc4093e6e1d1ce406c4937cdb0137ce8cbfb3722b5404c71c307ac4d03766"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.963854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerStarted","Data":"27094a5eecfea3bd81d2314594b8cfdb03f329abe60f33c847c8c969d4747a0d"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.965137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerStarted","Data":"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.966174 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"50f2f07f-efc4-4778-944c-d4819f0b0e30","Type":"ContainerStarted","Data":"3c7c7d0cb67bdd716e27a921f55c82cceb775a12654fa8ab0fa3866727e30e29"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.967669 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerStarted","Data":"81b8543a909b03c12110951d0f4dfaca241eaccbf11cf8dd8e3aa4e40b790556"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.967695 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerStarted","Data":"9491886528ee59a2997b30d600c2b1b7132f56a89a9aaa3a02c1e5325fbb4651"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.968524 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz" event={"ID":"1c223ffe-b12c-4c78-920a-66e6feb9178f","Type":"ContainerStarted","Data":"9c43acfe462c14a8f2d4b9a160251e927b7f46fa5baa4a259edc59c61d9dcadd"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.969709 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" event={"ID":"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0","Type":"ContainerDied","Data":"b81b2d09cf554bd40279185ed607a08f534a46ffc9af81dd3b055b475415182a"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.969733 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.994734 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerStarted","Data":"d04c91ea8c65ff23403733626ccc4e0944e79d09733c79a70fe91cced28380f8"} Mar 08 19:51:14 crc kubenswrapper[4885]: I0308 19:51:14.114976 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:51:14 crc kubenswrapper[4885]: I0308 19:51:14.125760 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:51:14 crc kubenswrapper[4885]: I0308 19:51:14.148047 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:51:14 crc kubenswrapper[4885]: I0308 19:51:14.156385 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:51:15 crc kubenswrapper[4885]: I0308 19:51:15.004394 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" 
event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerStarted","Data":"e8b9e6003711ba0073f7cced036d1550a5aac01aa3276b7f4a1f8ca2c14ba942"} Mar 08 19:51:15 crc kubenswrapper[4885]: I0308 19:51:15.379159 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19de3581-aa8c-48c0-aad6-4139d132ca70" path="/var/lib/kubelet/pods/19de3581-aa8c-48c0-aad6-4139d132ca70/volumes" Mar 08 19:51:15 crc kubenswrapper[4885]: I0308 19:51:15.379690 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0" path="/var/lib/kubelet/pods/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0/volumes" Mar 08 19:51:18 crc kubenswrapper[4885]: I0308 19:51:18.035409 4885 generic.go:334] "Generic (PLEG): container finished" podID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerID="8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2" exitCode=0 Mar 08 19:51:18 crc kubenswrapper[4885]: I0308 19:51:18.035512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerDied","Data":"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2"} Mar 08 19:51:21 crc kubenswrapper[4885]: I0308 19:51:21.538624 4885 generic.go:334] "Generic (PLEG): container finished" podID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerID="81b8543a909b03c12110951d0f4dfaca241eaccbf11cf8dd8e3aa4e40b790556" exitCode=0 Mar 08 19:51:21 crc kubenswrapper[4885]: I0308 19:51:21.538733 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerDied","Data":"81b8543a909b03c12110951d0f4dfaca241eaccbf11cf8dd8e3aa4e40b790556"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.565692 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerStarted","Data":"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.572969 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"50f2f07f-efc4-4778-944c-d4819f0b0e30","Type":"ContainerStarted","Data":"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.573083 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.576639 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerStarted","Data":"db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.578316 4885 generic.go:334] "Generic (PLEG): container finished" podID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerID="4d70b99d630277ded10493eacfddd38fddedced2d880750db49b6b3f39017dba" exitCode=0 Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.578357 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerDied","Data":"4d70b99d630277ded10493eacfddd38fddedced2d880750db49b6b3f39017dba"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.580707 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz" event={"ID":"1c223ffe-b12c-4c78-920a-66e6feb9178f","Type":"ContainerStarted","Data":"40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.580831 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:24 crc 
kubenswrapper[4885]: I0308 19:51:24.582672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerStarted","Data":"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.587743 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"da1d62ba-4033-4906-87c1-d673c1ab8637","Type":"ContainerStarted","Data":"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.588648 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.593915 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=14.756493401 podStartE2EDuration="31.593894371s" podCreationTimestamp="2026-03-08 19:50:53 +0000 UTC" firstStartedPulling="2026-03-08 19:50:55.713599376 +0000 UTC m=+1157.109653399" lastFinishedPulling="2026-03-08 19:51:12.551000346 +0000 UTC m=+1173.947054369" observedRunningTime="2026-03-08 19:51:24.589193276 +0000 UTC m=+1185.985247309" watchObservedRunningTime="2026-03-08 19:51:24.593894371 +0000 UTC m=+1185.989948404" Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.608155 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerStarted","Data":"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.609656 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=29.60962935 podStartE2EDuration="29.60962935s" podCreationTimestamp="2026-03-08 19:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:24.606598889 +0000 UTC m=+1186.002652912" watchObservedRunningTime="2026-03-08 19:51:24.60962935 +0000 UTC m=+1186.005683373"
Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.653304 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.357311555 podStartE2EDuration="26.653265682s" podCreationTimestamp="2026-03-08 19:50:58 +0000 UTC" firstStartedPulling="2026-03-08 19:51:13.048139508 +0000 UTC m=+1174.444193531" lastFinishedPulling="2026-03-08 19:51:23.344093605 +0000 UTC m=+1184.740147658" observedRunningTime="2026-03-08 19:51:24.646366049 +0000 UTC m=+1186.042420062" watchObservedRunningTime="2026-03-08 19:51:24.653265682 +0000 UTC m=+1186.049319705"
Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.672819 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.881367494 podStartE2EDuration="28.672802021s" podCreationTimestamp="2026-03-08 19:50:56 +0000 UTC" firstStartedPulling="2026-03-08 19:51:12.890329558 +0000 UTC m=+1174.286383581" lastFinishedPulling="2026-03-08 19:51:22.681764065 +0000 UTC m=+1184.077818108" observedRunningTime="2026-03-08 19:51:24.666410562 +0000 UTC m=+1186.062464585" watchObservedRunningTime="2026-03-08 19:51:24.672802021 +0000 UTC m=+1186.068856044"
Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.692936 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mn4lz" podStartSLOduration=12.517301673 podStartE2EDuration="22.692904647s" podCreationTimestamp="2026-03-08 19:51:02 +0000 UTC" firstStartedPulling="2026-03-08 19:51:13.084813884 +0000 UTC m=+1174.480867907" lastFinishedPulling="2026-03-08 19:51:23.260416858 +0000 UTC m=+1184.656470881" observedRunningTime="2026-03-08 19:51:24.682228883 +0000 UTC m=+1186.078282926" watchObservedRunningTime="2026-03-08 19:51:24.692904647 +0000 UTC m=+1186.088958680"
Mar 08 19:51:25 crc kubenswrapper[4885]: I0308 19:51:25.131049 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 08 19:51:25 crc kubenswrapper[4885]: I0308 19:51:25.131092 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 08 19:51:25 crc kubenswrapper[4885]: I0308 19:51:25.629271 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerStarted","Data":"e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1"}
Mar 08 19:51:25 crc kubenswrapper[4885]: I0308 19:51:25.630254 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerStarted","Data":"a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f"}
Mar 08 19:51:25 crc kubenswrapper[4885]: I0308 19:51:25.658963 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-pp4rs" podStartSLOduration=13.42910377 podStartE2EDuration="22.65894111s" podCreationTimestamp="2026-03-08 19:51:03 +0000 UTC" firstStartedPulling="2026-03-08 19:51:14.029469448 +0000 UTC m=+1175.425523471" lastFinishedPulling="2026-03-08 19:51:23.259306788 +0000 UTC m=+1184.655360811" observedRunningTime="2026-03-08 19:51:25.652027115 +0000 UTC m=+1187.048081148" watchObservedRunningTime="2026-03-08 19:51:25.65894111 +0000 UTC m=+1187.054995143"
Mar 08 19:51:26 crc kubenswrapper[4885]: I0308 19:51:26.482711 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 08 19:51:26 crc kubenswrapper[4885]: I0308 19:51:26.483086 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 08 19:51:26 crc kubenswrapper[4885]: I0308 19:51:26.643144 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pp4rs"
Mar 08 19:51:26 crc kubenswrapper[4885]: I0308 19:51:26.643413 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pp4rs"
Mar 08 19:51:28 crc kubenswrapper[4885]: E0308 19:51:28.604418 4885 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.80:54714->38.102.83.80:33667: write tcp 38.102.83.80:54714->38.102.83.80:33667: write: broken pipe
Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.659819 4885 generic.go:334] "Generic (PLEG): container finished" podID="2196a973-cc42-4633-8c99-1422d07d475a" containerID="3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08" exitCode=0
Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.659975 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" event={"ID":"2196a973-cc42-4633-8c99-1422d07d475a","Type":"ContainerDied","Data":"3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08"}
Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.662726 4885 generic.go:334] "Generic (PLEG): container finished" podID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerID="fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030" exitCode=0
Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.662808 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" event={"ID":"604f26ec-2884-4eb6-97f9-e2961f8907b1","Type":"ContainerDied","Data":"fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030"}
Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.666092 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerStarted","Data":"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8"}
Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.670742 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerStarted","Data":"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5"}
Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.720171 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.158897603 podStartE2EDuration="27.720153411s" podCreationTimestamp="2026-03-08 19:51:01 +0000 UTC" firstStartedPulling="2026-03-08 19:51:13.273859756 +0000 UTC m=+1174.669913779" lastFinishedPulling="2026-03-08 19:51:27.835115564 +0000 UTC m=+1189.231169587" observedRunningTime="2026-03-08 19:51:28.713037222 +0000 UTC m=+1190.109091255" watchObservedRunningTime="2026-03-08 19:51:28.720153411 +0000 UTC m=+1190.116207434"
Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.754127 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.3646741 podStartE2EDuration="24.754109905s" podCreationTimestamp="2026-03-08 19:51:04 +0000 UTC" firstStartedPulling="2026-03-08 19:51:13.465951659 +0000 UTC m=+1174.862005692" lastFinishedPulling="2026-03-08 19:51:27.855387474 +0000 UTC m=+1189.251441497" observedRunningTime="2026-03-08 19:51:28.743751799 +0000 UTC m=+1190.139805822" watchObservedRunningTime="2026-03-08 19:51:28.754109905 +0000 UTC m=+1190.150163928"
Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.997587 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.163979 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.270604 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.401330 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.455516 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.699720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" event={"ID":"2196a973-cc42-4633-8c99-1422d07d475a","Type":"ContainerStarted","Data":"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b"}
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.699970 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7"
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.704270 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" event={"ID":"604f26ec-2884-4eb6-97f9-e2961f8907b1","Type":"ContainerStarted","Data":"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf"}
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.705183 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.731212 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" podStartSLOduration=2.922839131 podStartE2EDuration="37.731183042s" podCreationTimestamp="2026-03-08 19:50:52 +0000 UTC" firstStartedPulling="2026-03-08 19:50:53.002169837 +0000 UTC m=+1154.398223860" lastFinishedPulling="2026-03-08 19:51:27.810513758 +0000 UTC m=+1189.206567771" observedRunningTime="2026-03-08 19:51:29.722636464 +0000 UTC m=+1191.118690557" watchObservedRunningTime="2026-03-08 19:51:29.731183042 +0000 UTC m=+1191.127237105"
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.757446 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" podStartSLOduration=3.101946319 podStartE2EDuration="38.75741898s" podCreationTimestamp="2026-03-08 19:50:51 +0000 UTC" firstStartedPulling="2026-03-08 19:50:52.104046171 +0000 UTC m=+1153.500100194" lastFinishedPulling="2026-03-08 19:51:27.759518802 +0000 UTC m=+1189.155572855" observedRunningTime="2026-03-08 19:51:29.747478506 +0000 UTC m=+1191.143532559" watchObservedRunningTime="2026-03-08 19:51:29.75741898 +0000 UTC m=+1191.153473043"
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.790118 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.888226 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.970657 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.061626 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"]
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.081899 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"]
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.087858 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.091478 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.095367 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"]
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.096352 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.106468 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.121715 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"]
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.130217 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"]
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194676 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194748 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194798 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194828 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194849 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsn2v\" (UniqueName: \"kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194877 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194907 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194942 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4tlt\" (UniqueName: \"kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194971 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296398 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296448 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsn2v\" (UniqueName: \"kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296481 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296515 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296535 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4tlt\" (UniqueName: \"kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296563 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296614 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296632 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296659 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.297381 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.298203 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.298228 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.298895 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.299043 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.299115 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.302570 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.315456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4tlt\" (UniqueName: \"kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.315501 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.320894 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsn2v\" (UniqueName: \"kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.412276 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.431026 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.474310 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"]
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.513448 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"]
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.515203 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.518426 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.524325 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"]
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.602431 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.602477 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.602496 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.602577 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.602619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jm7q\" (UniqueName: \"kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.719156 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.719217 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jm7q\" (UniqueName: \"kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.719245 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.719270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.719291 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.720380 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.720466 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.720896 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.721071 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.739516 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jm7q\" (UniqueName: \"kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.748618 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.748882 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.748629 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="dnsmasq-dns" containerID="cri-o://d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf" gracePeriod=10
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.792655 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.835068 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"]
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.845124 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.908063 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"]
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.015400 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.016937 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.019750 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-cmhs5"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.019887 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.019996 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.020170 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.041325 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133123 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133253 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133299 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133342 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbl7q\" (UniqueName: \"kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133398 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133440 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.156357 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.234765 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config\") pod \"604f26ec-2884-4eb6-97f9-e2961f8907b1\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") "
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.234880 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mznsj\" (UniqueName: \"kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj\") pod \"604f26ec-2884-4eb6-97f9-e2961f8907b1\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") "
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235005 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc\") pod \"604f26ec-2884-4eb6-97f9-e2961f8907b1\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") "
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235230 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235304 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235345 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235365 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbl7q\" (UniqueName: \"kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235388 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235422 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.236423 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0"
Mar 08 19:51:31 crc
kubenswrapper[4885]: I0308 19:51:31.237157 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.237833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.239553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.239667 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.240794 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.241009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj" (OuterVolumeSpecName: "kube-api-access-mznsj") pod 
"604f26ec-2884-4eb6-97f9-e2961f8907b1" (UID: "604f26ec-2884-4eb6-97f9-e2961f8907b1"). InnerVolumeSpecName "kube-api-access-mznsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.249991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbl7q\" (UniqueName: \"kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.273483 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config" (OuterVolumeSpecName: "config") pod "604f26ec-2884-4eb6-97f9-e2961f8907b1" (UID: "604f26ec-2884-4eb6-97f9-e2961f8907b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.275086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "604f26ec-2884-4eb6-97f9-e2961f8907b1" (UID: "604f26ec-2884-4eb6-97f9-e2961f8907b1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.279011 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.337985 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.338231 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.338241 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mznsj\" (UniqueName: \"kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.342775 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"] Mar 08 19:51:31 crc kubenswrapper[4885]: W0308 19:51:31.343652 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod201fa134_20f7_4902_8fd4_ba352e7f4e95.slice/crio-4817fdc2e93c45edc661de711d2e74d24fa8c501db17ccf9809b6e4e92461549 WatchSource:0}: Error finding container 4817fdc2e93c45edc661de711d2e74d24fa8c501db17ccf9809b6e4e92461549: Status 404 returned error can't find the container with id 4817fdc2e93c45edc661de711d2e74d24fa8c501db17ccf9809b6e4e92461549 Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.369161 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.385881 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.667206 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.758563 4885 generic.go:334] "Generic (PLEG): container finished" podID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerID="d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf" exitCode=0 Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.758647 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" event={"ID":"604f26ec-2884-4eb6-97f9-e2961f8907b1","Type":"ContainerDied","Data":"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.758647 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.758679 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" event={"ID":"604f26ec-2884-4eb6-97f9-e2961f8907b1","Type":"ContainerDied","Data":"abe3994c827c15ee527cc92242a1479dbff3de65c1e91c72ef3fd14c10326728"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.758716 4885 scope.go:117] "RemoveContainer" containerID="d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.763036 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5qsh8" event={"ID":"474d55a2-f4f0-4e46-809c-367a3110c33d","Type":"ContainerStarted","Data":"e8623499ee05972629def49d746cd17bbc01095c41d8c7e431c723d1dc4187ee"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.763065 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5qsh8" event={"ID":"474d55a2-f4f0-4e46-809c-367a3110c33d","Type":"ContainerStarted","Data":"35bbd23060f341a3429a5bab6384434330b103927103bf0acea28883cf67dc65"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.765421 4885 generic.go:334] "Generic (PLEG): container finished" podID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerID="e6cf715e1922fcc27c66130c6d2113c75ad20d6e5d12122260396ed02a84d181" exitCode=0 Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.765476 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" event={"ID":"201fa134-20f7-4902-8fd4-ba352e7f4e95","Type":"ContainerDied","Data":"e6cf715e1922fcc27c66130c6d2113c75ad20d6e5d12122260396ed02a84d181"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.765497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" 
event={"ID":"201fa134-20f7-4902-8fd4-ba352e7f4e95","Type":"ContainerStarted","Data":"4817fdc2e93c45edc661de711d2e74d24fa8c501db17ccf9809b6e4e92461549"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.767971 4885 generic.go:334] "Generic (PLEG): container finished" podID="f432b919-7772-41bf-9113-94eefe45e347" containerID="fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0" exitCode=0 Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.768022 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" event={"ID":"f432b919-7772-41bf-9113-94eefe45e347","Type":"ContainerDied","Data":"fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.768185 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" event={"ID":"f432b919-7772-41bf-9113-94eefe45e347","Type":"ContainerStarted","Data":"472650bb697781b76975122a9d79fd5f4e880cc3eb0edd5811698f77d98f72e5"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.769468 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerStarted","Data":"cd1df5e26dfde01021643639b3d30a9000c123fa83c48692684173ba1b046531"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.769612 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="dnsmasq-dns" containerID="cri-o://bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b" gracePeriod=10 Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.785686 4885 scope.go:117] "RemoveContainer" containerID="fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.785609 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"] Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.791789 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"] Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.815822 4885 scope.go:117] "RemoveContainer" containerID="d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf" Mar 08 19:51:31 crc kubenswrapper[4885]: E0308 19:51:31.816175 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf\": container with ID starting with d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf not found: ID does not exist" containerID="d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.816214 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf"} err="failed to get container status \"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf\": rpc error: code = NotFound desc = could not find container \"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf\": container with ID starting with d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf not found: ID does not exist" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.816236 4885 scope.go:117] "RemoveContainer" containerID="fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030" Mar 08 19:51:31 crc kubenswrapper[4885]: E0308 19:51:31.816479 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030\": container with ID starting with fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030 not found: ID 
does not exist" containerID="fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.816733 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030"} err="failed to get container status \"fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030\": rpc error: code = NotFound desc = could not find container \"fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030\": container with ID starting with fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030 not found: ID does not exist" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.833232 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.844789 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-5qsh8" podStartSLOduration=1.8447738089999999 podStartE2EDuration="1.844773809s" podCreationTimestamp="2026-03-08 19:51:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:31.836476798 +0000 UTC m=+1193.232530831" watchObservedRunningTime="2026-03-08 19:51:31.844773809 +0000 UTC m=+1193.240827832" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.124260 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a047-account-create-update-64r5q"] Mar 08 19:51:32 crc kubenswrapper[4885]: E0308 19:51:32.124826 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="init" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.124836 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="init" Mar 08 19:51:32 crc 
kubenswrapper[4885]: E0308 19:51:32.124874 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="dnsmasq-dns" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.124881 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="dnsmasq-dns" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.125028 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="dnsmasq-dns" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.125565 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.127226 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.130479 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9jddz"] Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.131193 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.137134 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9jddz"] Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.159410 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7cjr\" (UniqueName: \"kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.159468 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrjtg\" (UniqueName: \"kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.159546 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.159576 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.172775 4885 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-a047-account-create-update-64r5q"] Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.197798 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.261052 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nskr\" (UniqueName: \"kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr\") pod \"2196a973-cc42-4633-8c99-1422d07d475a\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.261445 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config\") pod \"2196a973-cc42-4633-8c99-1422d07d475a\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.261546 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc\") pod \"2196a973-cc42-4633-8c99-1422d07d475a\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.261848 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7cjr\" (UniqueName: \"kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.262212 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrjtg\" (UniqueName: 
\"kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.262494 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.263947 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.263888 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.264560 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.266270 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr" (OuterVolumeSpecName: "kube-api-access-5nskr") pod "2196a973-cc42-4633-8c99-1422d07d475a" (UID: "2196a973-cc42-4633-8c99-1422d07d475a"). InnerVolumeSpecName "kube-api-access-5nskr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.276622 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrjtg\" (UniqueName: \"kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.276689 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7cjr\" (UniqueName: \"kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.301166 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config" (OuterVolumeSpecName: "config") pod "2196a973-cc42-4633-8c99-1422d07d475a" (UID: "2196a973-cc42-4633-8c99-1422d07d475a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.327366 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2196a973-cc42-4633-8c99-1422d07d475a" (UID: "2196a973-cc42-4633-8c99-1422d07d475a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.365148 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.365170 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.365180 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nskr\" (UniqueName: \"kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.456271 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.485835 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.810750 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" event={"ID":"201fa134-20f7-4902-8fd4-ba352e7f4e95","Type":"ContainerStarted","Data":"b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97"} Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.811175 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.818761 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" event={"ID":"f432b919-7772-41bf-9113-94eefe45e347","Type":"ContainerStarted","Data":"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2"} Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.819130 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.822376 4885 generic.go:334] "Generic (PLEG): container finished" podID="2196a973-cc42-4633-8c99-1422d07d475a" containerID="bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b" exitCode=0 Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.822762 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.822801 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" event={"ID":"2196a973-cc42-4633-8c99-1422d07d475a","Type":"ContainerDied","Data":"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b"} Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.822893 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" event={"ID":"2196a973-cc42-4633-8c99-1422d07d475a","Type":"ContainerDied","Data":"ea36b64eee9bd69dcd99d1003ff2856f54e1225bf3a3740ad6059b5726729f56"} Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.822957 4885 scope.go:117] "RemoveContainer" containerID="bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.840673 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" podStartSLOduration=2.840655456 podStartE2EDuration="2.840655456s" podCreationTimestamp="2026-03-08 19:51:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:32.830474876 +0000 UTC m=+1194.226528899" watchObservedRunningTime="2026-03-08 19:51:32.840655456 +0000 UTC m=+1194.236709479" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.856175 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" podStartSLOduration=2.856157919 podStartE2EDuration="2.856157919s" podCreationTimestamp="2026-03-08 19:51:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:32.852278595 +0000 UTC m=+1194.248332618" watchObservedRunningTime="2026-03-08 19:51:32.856157919 +0000 UTC 
m=+1194.252211942" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.876667 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"] Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.882577 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"] Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.889913 4885 scope.go:117] "RemoveContainer" containerID="3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.911751 4885 scope.go:117] "RemoveContainer" containerID="bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b" Mar 08 19:51:32 crc kubenswrapper[4885]: E0308 19:51:32.912112 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b\": container with ID starting with bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b not found: ID does not exist" containerID="bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.912134 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b"} err="failed to get container status \"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b\": rpc error: code = NotFound desc = could not find container \"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b\": container with ID starting with bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b not found: ID does not exist" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.912152 4885 scope.go:117] "RemoveContainer" containerID="3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08" Mar 08 19:51:32 crc kubenswrapper[4885]: E0308 
19:51:32.912467 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08\": container with ID starting with 3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08 not found: ID does not exist" containerID="3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.912501 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08"} err="failed to get container status \"3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08\": rpc error: code = NotFound desc = could not find container \"3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08\": container with ID starting with 3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08 not found: ID does not exist" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.173404 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9jddz"] Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.183235 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a047-account-create-update-64r5q"] Mar 08 19:51:33 crc kubenswrapper[4885]: W0308 19:51:33.191757 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f0edc25_2cc1_4111_96e3_3807e6463d57.slice/crio-656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7 WatchSource:0}: Error finding container 656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7: Status 404 returned error can't find the container with id 656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7 Mar 08 19:51:33 crc kubenswrapper[4885]: W0308 19:51:33.196628 4885 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f704685_800d_4386_a47d_8c60b0885aca.slice/crio-8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17 WatchSource:0}: Error finding container 8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17: Status 404 returned error can't find the container with id 8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17 Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.379870 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2196a973-cc42-4633-8c99-1422d07d475a" path="/var/lib/kubelet/pods/2196a973-cc42-4633-8c99-1422d07d475a/volumes" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.381661 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" path="/var/lib/kubelet/pods/604f26ec-2884-4eb6-97f9-e2961f8907b1/volumes" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.760758 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rrt62"] Mar 08 19:51:33 crc kubenswrapper[4885]: E0308 19:51:33.761067 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="dnsmasq-dns" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.761083 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="dnsmasq-dns" Mar 08 19:51:33 crc kubenswrapper[4885]: E0308 19:51:33.761103 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="init" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.761109 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="init" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.761276 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="dnsmasq-dns" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.761761 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.764306 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.811285 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rrt62"] Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.834009 4885 generic.go:334] "Generic (PLEG): container finished" podID="5f0edc25-2cc1-4111-96e3-3807e6463d57" containerID="e5feabe92d49eb8fd4bb48801094df276f9bf1fc07181b4b0ee0908d604394fb" exitCode=0 Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.834072 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9jddz" event={"ID":"5f0edc25-2cc1-4111-96e3-3807e6463d57","Type":"ContainerDied","Data":"e5feabe92d49eb8fd4bb48801094df276f9bf1fc07181b4b0ee0908d604394fb"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.834097 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9jddz" event={"ID":"5f0edc25-2cc1-4111-96e3-3807e6463d57","Type":"ContainerStarted","Data":"656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.835949 4885 generic.go:334] "Generic (PLEG): container finished" podID="0f704685-800d-4386-a47d-8c60b0885aca" containerID="3cb04d8216824e70d6b5ea33718713bb6914ece1b0e3362b1186f648f1502b81" exitCode=0 Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.836020 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a047-account-create-update-64r5q" 
event={"ID":"0f704685-800d-4386-a47d-8c60b0885aca","Type":"ContainerDied","Data":"3cb04d8216824e70d6b5ea33718713bb6914ece1b0e3362b1186f648f1502b81"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.836088 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a047-account-create-update-64r5q" event={"ID":"0f704685-800d-4386-a47d-8c60b0885aca","Type":"ContainerStarted","Data":"8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.839250 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerStarted","Data":"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.839298 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerStarted","Data":"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.839380 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.894358 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts\") pod \"root-account-create-update-rrt62\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.894449 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vppxj\" (UniqueName: \"kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj\") pod \"root-account-create-update-rrt62\" (UID: 
\"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.907739 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.902105392 podStartE2EDuration="3.907714758s" podCreationTimestamp="2026-03-08 19:51:30 +0000 UTC" firstStartedPulling="2026-03-08 19:51:31.686657761 +0000 UTC m=+1193.082711784" lastFinishedPulling="2026-03-08 19:51:32.692267137 +0000 UTC m=+1194.088321150" observedRunningTime="2026-03-08 19:51:33.882651461 +0000 UTC m=+1195.278705494" watchObservedRunningTime="2026-03-08 19:51:33.907714758 +0000 UTC m=+1195.303768821" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.996556 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts\") pod \"root-account-create-update-rrt62\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.997471 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts\") pod \"root-account-create-update-rrt62\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.998078 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vppxj\" (UniqueName: \"kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj\") pod \"root-account-create-update-rrt62\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.019305 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vppxj\" (UniqueName: \"kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj\") pod \"root-account-create-update-rrt62\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.078577 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.337214 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rrt62"] Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.858329 4885 generic.go:334] "Generic (PLEG): container finished" podID="83db30de-c4aa-4a2f-9e5f-e4545e4ff475" containerID="d990977988383de183ee74b10460a2aef417ed74ff41f049c648f4b0922ddb17" exitCode=0 Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.858403 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rrt62" event={"ID":"83db30de-c4aa-4a2f-9e5f-e4545e4ff475","Type":"ContainerDied","Data":"d990977988383de183ee74b10460a2aef417ed74ff41f049c648f4b0922ddb17"} Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.858823 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rrt62" event={"ID":"83db30de-c4aa-4a2f-9e5f-e4545e4ff475","Type":"ContainerStarted","Data":"4748f37f3293a79f70bd6bd68218faa3a5c8bfcbcfa932e6390b681055f8d44c"} Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.305439 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9jddz" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.323767 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.424733 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7cjr\" (UniqueName: \"kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr\") pod \"0f704685-800d-4386-a47d-8c60b0885aca\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.425016 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts\") pod \"5f0edc25-2cc1-4111-96e3-3807e6463d57\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.425176 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts\") pod \"0f704685-800d-4386-a47d-8c60b0885aca\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.425371 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrjtg\" (UniqueName: \"kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg\") pod \"5f0edc25-2cc1-4111-96e3-3807e6463d57\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.425560 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f704685-800d-4386-a47d-8c60b0885aca" (UID: "0f704685-800d-4386-a47d-8c60b0885aca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.425702 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f0edc25-2cc1-4111-96e3-3807e6463d57" (UID: "5f0edc25-2cc1-4111-96e3-3807e6463d57"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.426406 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.426491 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.431596 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg" (OuterVolumeSpecName: "kube-api-access-vrjtg") pod "5f0edc25-2cc1-4111-96e3-3807e6463d57" (UID: "5f0edc25-2cc1-4111-96e3-3807e6463d57"). InnerVolumeSpecName "kube-api-access-vrjtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.436879 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr" (OuterVolumeSpecName: "kube-api-access-t7cjr") pod "0f704685-800d-4386-a47d-8c60b0885aca" (UID: "0f704685-800d-4386-a47d-8c60b0885aca"). InnerVolumeSpecName "kube-api-access-t7cjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.529296 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrjtg\" (UniqueName: \"kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.529616 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7cjr\" (UniqueName: \"kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.870967 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9jddz" event={"ID":"5f0edc25-2cc1-4111-96e3-3807e6463d57","Type":"ContainerDied","Data":"656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7"} Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.871023 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.871052 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9jddz" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.875415 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a047-account-create-update-64r5q" event={"ID":"0f704685-800d-4386-a47d-8c60b0885aca","Type":"ContainerDied","Data":"8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17"} Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.875459 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.875587 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.310549 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.447580 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts\") pod \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.447667 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vppxj\" (UniqueName: \"kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj\") pod \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.450261 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83db30de-c4aa-4a2f-9e5f-e4545e4ff475" (UID: "83db30de-c4aa-4a2f-9e5f-e4545e4ff475"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.459213 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj" (OuterVolumeSpecName: "kube-api-access-vppxj") pod "83db30de-c4aa-4a2f-9e5f-e4545e4ff475" (UID: "83db30de-c4aa-4a2f-9e5f-e4545e4ff475"). InnerVolumeSpecName "kube-api-access-vppxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.551316 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.551372 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vppxj\" (UniqueName: \"kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.884485 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rrt62" event={"ID":"83db30de-c4aa-4a2f-9e5f-e4545e4ff475","Type":"ContainerDied","Data":"4748f37f3293a79f70bd6bd68218faa3a5c8bfcbcfa932e6390b681055f8d44c"} Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.884970 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4748f37f3293a79f70bd6bd68218faa3a5c8bfcbcfa932e6390b681055f8d44c" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.884576 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.310395 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-pq8mq"] Mar 08 19:51:37 crc kubenswrapper[4885]: E0308 19:51:37.310802 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83db30de-c4aa-4a2f-9e5f-e4545e4ff475" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.310829 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="83db30de-c4aa-4a2f-9e5f-e4545e4ff475" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: E0308 19:51:37.310851 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0edc25-2cc1-4111-96e3-3807e6463d57" containerName="mariadb-database-create" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.310864 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0edc25-2cc1-4111-96e3-3807e6463d57" containerName="mariadb-database-create" Mar 08 19:51:37 crc kubenswrapper[4885]: E0308 19:51:37.310902 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f704685-800d-4386-a47d-8c60b0885aca" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.310913 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f704685-800d-4386-a47d-8c60b0885aca" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.311133 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f704685-800d-4386-a47d-8c60b0885aca" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.311150 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0edc25-2cc1-4111-96e3-3807e6463d57" containerName="mariadb-database-create" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.311163 4885 
memory_manager.go:354] "RemoveStaleState removing state" podUID="83db30de-c4aa-4a2f-9e5f-e4545e4ff475" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.311917 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.317823 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.319776 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zffrj" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.320401 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pq8mq"] Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.475080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.475274 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.475565 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfl6p\" (UniqueName: \"kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " 
pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.475615 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.577957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.578167 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.578262 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.578462 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfl6p\" (UniqueName: \"kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.583295 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.593385 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.593713 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.602506 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfl6p\" (UniqueName: \"kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.649856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.963425 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ll64z"] Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.965002 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.985712 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ll64z"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.001103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.001189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxtkz\" (UniqueName: \"kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.074443 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3705-account-create-update-2brz9"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.075657 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.081908 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.082893 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3705-account-create-update-2brz9"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.103174 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.103234 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxtkz\" (UniqueName: \"kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.103323 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp66p\" (UniqueName: \"kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p\") pod \"keystone-3705-account-create-update-2brz9\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.103377 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts\") pod \"keystone-3705-account-create-update-2brz9\" (UID: 
\"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.103875 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.128499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxtkz\" (UniqueName: \"kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.185320 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-25qrp"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.186432 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.197718 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-25qrp"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.204875 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g56sf\" (UniqueName: \"kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.204946 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.205124 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp66p\" (UniqueName: \"kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p\") pod \"keystone-3705-account-create-update-2brz9\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.205181 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts\") pod \"keystone-3705-account-create-update-2brz9\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.206270 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts\") pod \"keystone-3705-account-create-update-2brz9\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.226772 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp66p\" (UniqueName: \"kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p\") pod \"keystone-3705-account-create-update-2brz9\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.241363 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pq8mq"] Mar 08 19:51:38 crc kubenswrapper[4885]: W0308 19:51:38.247619 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod618b5189_8b29_473f_b59c_e911fca71041.slice/crio-6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39 WatchSource:0}: Error finding container 6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39: Status 404 returned error can't find the container with id 6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39 Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.276229 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3284-account-create-update-qht6h"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.277139 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.280737 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.295968 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3284-account-create-update-qht6h"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.299423 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.305881 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.305948 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g56sf\" (UniqueName: \"kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.306051 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.306235 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hqnf9\" (UniqueName: \"kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.307882 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.326484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g56sf\" (UniqueName: \"kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.391047 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.407916 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.408105 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqnf9\" (UniqueName: \"kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.409661 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.424841 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqnf9\" (UniqueName: \"kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.507705 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.594910 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.729412 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ll64z"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.853626 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3705-account-create-update-2brz9"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.921839 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3705-account-create-update-2brz9" event={"ID":"8b3418f5-a92a-4fe6-b0ea-929b54ecb052","Type":"ContainerStarted","Data":"9b1459e7cf206ea3c6375a47a07ece269bc28d4b4c10bc223fc70b9125df1823"} Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.934848 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ll64z" event={"ID":"f7884923-e1d5-4b4d-a285-680bfbe38277","Type":"ContainerStarted","Data":"386f6aaa7f8daad0074b28ca5968bd9be45e6528e49c396b92ec2011c6026b34"} Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.936239 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pq8mq" event={"ID":"618b5189-8b29-473f-b59c-e911fca71041","Type":"ContainerStarted","Data":"6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39"} Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.956659 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-25qrp"] Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.021337 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"] Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.021608 4885 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="dnsmasq-dns" containerID="cri-o://e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2" gracePeriod=10 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.030059 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.045272 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"] Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.046742 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.060913 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"] Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.117346 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3284-account-create-update-qht6h"] Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.118504 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.118571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.118601 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq487\" (UniqueName: \"kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.118654 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.118682 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.234786 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.234855 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.234900 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq487\" (UniqueName: \"kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.234958 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.234986 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.236028 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.236739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.238046 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.239205 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.445009 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq487\" (UniqueName: \"kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.534555 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.725793 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.847989 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb\") pod \"f432b919-7772-41bf-9113-94eefe45e347\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.848089 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc\") pod \"f432b919-7772-41bf-9113-94eefe45e347\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.848115 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsn2v\" (UniqueName: \"kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v\") pod \"f432b919-7772-41bf-9113-94eefe45e347\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.848213 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config\") pod \"f432b919-7772-41bf-9113-94eefe45e347\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.855102 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v" (OuterVolumeSpecName: "kube-api-access-fsn2v") pod "f432b919-7772-41bf-9113-94eefe45e347" (UID: "f432b919-7772-41bf-9113-94eefe45e347"). InnerVolumeSpecName "kube-api-access-fsn2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.888942 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config" (OuterVolumeSpecName: "config") pod "f432b919-7772-41bf-9113-94eefe45e347" (UID: "f432b919-7772-41bf-9113-94eefe45e347"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.889087 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f432b919-7772-41bf-9113-94eefe45e347" (UID: "f432b919-7772-41bf-9113-94eefe45e347"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.905121 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f432b919-7772-41bf-9113-94eefe45e347" (UID: "f432b919-7772-41bf-9113-94eefe45e347"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.950002 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.950045 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsn2v\" (UniqueName: \"kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.950062 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.950076 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.958707 4885 generic.go:334] "Generic (PLEG): container finished" podID="f432b919-7772-41bf-9113-94eefe45e347" containerID="e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2" exitCode=0 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.958767 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" event={"ID":"f432b919-7772-41bf-9113-94eefe45e347","Type":"ContainerDied","Data":"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.958793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" event={"ID":"f432b919-7772-41bf-9113-94eefe45e347","Type":"ContainerDied","Data":"472650bb697781b76975122a9d79fd5f4e880cc3eb0edd5811698f77d98f72e5"} Mar 08 
19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.958811 4885 scope.go:117] "RemoveContainer" containerID="e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.958942 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.966665 4885 generic.go:334] "Generic (PLEG): container finished" podID="761f5c93-2ed3-43f0-acaf-ee92d0719ec3" containerID="ca2add6996115e29bd86a097fbce1cceadad7160db189d6c7e405a523a1ccb6e" exitCode=0 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.966747 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-25qrp" event={"ID":"761f5c93-2ed3-43f0-acaf-ee92d0719ec3","Type":"ContainerDied","Data":"ca2add6996115e29bd86a097fbce1cceadad7160db189d6c7e405a523a1ccb6e"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.966770 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-25qrp" event={"ID":"761f5c93-2ed3-43f0-acaf-ee92d0719ec3","Type":"ContainerStarted","Data":"0de5eb20fec0068fad7220dbd8c11cda9bad6ac30e75bf5f0e0e2846ae820c1a"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.968790 4885 generic.go:334] "Generic (PLEG): container finished" podID="f7884923-e1d5-4b4d-a285-680bfbe38277" containerID="b4398eab96435c81b8a2366ba9291b7b0c13edf908fc801823865f8458709b7a" exitCode=0 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.968826 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ll64z" event={"ID":"f7884923-e1d5-4b4d-a285-680bfbe38277","Type":"ContainerDied","Data":"b4398eab96435c81b8a2366ba9291b7b0c13edf908fc801823865f8458709b7a"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.974660 4885 generic.go:334] "Generic (PLEG): container finished" podID="6fff4a7a-1b14-4e29-8c84-d7fc55de879c" 
containerID="57d8097d34b17ff81e694e75a211c6042455808aeca7d092f8501d703a78d088" exitCode=0 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.974734 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3284-account-create-update-qht6h" event={"ID":"6fff4a7a-1b14-4e29-8c84-d7fc55de879c","Type":"ContainerDied","Data":"57d8097d34b17ff81e694e75a211c6042455808aeca7d092f8501d703a78d088"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.974764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3284-account-create-update-qht6h" event={"ID":"6fff4a7a-1b14-4e29-8c84-d7fc55de879c","Type":"ContainerStarted","Data":"bd759665cb6fe07543d620b6ac74746d2ebba785ee89af9724b1dbf768d76262"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.980188 4885 generic.go:334] "Generic (PLEG): container finished" podID="8b3418f5-a92a-4fe6-b0ea-929b54ecb052" containerID="2ff4df6777cb04e247eca00bf1613dce65653cf286ef17867253f3e89e727d13" exitCode=0 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.980263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3705-account-create-update-2brz9" event={"ID":"8b3418f5-a92a-4fe6-b0ea-929b54ecb052","Type":"ContainerDied","Data":"2ff4df6777cb04e247eca00bf1613dce65653cf286ef17867253f3e89e727d13"} Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.034776 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.074564 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.080650 4885 scope.go:117] "RemoveContainer" containerID="fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.082071 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"] 
Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.102448 4885 scope.go:117] "RemoveContainer" containerID="e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2" Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.104440 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2\": container with ID starting with e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2 not found: ID does not exist" containerID="e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.104488 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2"} err="failed to get container status \"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2\": rpc error: code = NotFound desc = could not find container \"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2\": container with ID starting with e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2 not found: ID does not exist" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.104516 4885 scope.go:117] "RemoveContainer" containerID="fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0" Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.104860 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0\": container with ID starting with fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0 not found: ID does not exist" containerID="fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.104902 4885 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0"} err="failed to get container status \"fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0\": rpc error: code = NotFound desc = could not find container \"fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0\": container with ID starting with fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0 not found: ID does not exist" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.200710 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.201055 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="dnsmasq-dns" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.201067 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="dnsmasq-dns" Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.201089 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="init" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.201115 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="init" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.201268 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="dnsmasq-dns" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.209510 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.213899 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.214131 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.214155 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.214373 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-nbh5b" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.226080 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.229736 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rrt62"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.235526 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rrt62"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.355510 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.355576 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: 
I0308 19:51:40.355608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.355665 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.355701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22mr6\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.355739 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458428 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458526 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458571 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22mr6\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458671 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458730 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458897 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" 
Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.459041 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.459067 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.459115 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. No retries permitted until 2026-03-08 19:51:40.959098016 +0000 UTC m=+1202.355152039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.459200 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.459234 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.467304 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle\") pod \"swift-storage-0\" (UID: 
\"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.476943 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22mr6\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.516549 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.806718 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rqttf"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.807852 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.811316 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.811509 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.811704 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.825310 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rqttf"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.834458 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rqttf"] Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.837627 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-fffzj ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-fffzj ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-rqttf" podUID="f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.844485 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mn5x8"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.845907 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.849108 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.863327 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mn5x8"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967580 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlg4w\" (UniqueName: \"kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967628 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967695 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967731 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices\") pod \"swift-ring-rebalance-mn5x8\" (UID: 
\"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967765 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967785 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967821 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fffzj\" (UniqueName: \"kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967894 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts\") pod 
\"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967971 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.968006 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.968036 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.968056 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.968078 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: 
\"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.968116 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.970219 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.970261 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.970316 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. No retries permitted until 2026-03-08 19:51:41.970297302 +0000 UTC m=+1203.366351325 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.989688 4885 generic.go:334] "Generic (PLEG): container finished" podID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerID="b8826682ae559379d101397fb94513059f3cfbd38258fd8e20b0bbd2e14276d1" exitCode=0 Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.990177 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.989888 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" event={"ID":"67aa348d-fe05-4e05-af01-a0b22d170a9b","Type":"ContainerDied","Data":"b8826682ae559379d101397fb94513059f3cfbd38258fd8e20b0bbd2e14276d1"} Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.990766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" event={"ID":"67aa348d-fe05-4e05-af01-a0b22d170a9b","Type":"ContainerStarted","Data":"e012096eca7d05b70c67ac3ab0c256b21f51521655fd61aa44ededf7f87ad72c"} Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.002964 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.069850 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070201 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070229 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 
08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070245 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070264 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070288 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070328 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlg4w\" (UniqueName: \"kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070364 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070382 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070402 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070423 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070441 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070462 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fffzj\" (UniqueName: \"kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070478 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070707 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.071158 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.071219 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.071634 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.073820 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift\") pod 
\"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.074860 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.075532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.076441 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.079065 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.083099 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 
19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.088642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.090158 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.091510 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fffzj\" (UniqueName: \"kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.103356 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlg4w\" (UniqueName: \"kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.166554 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.171370 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.171556 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.173133 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.174484 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts" (OuterVolumeSpecName: "scripts") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.272748 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fffzj\" (UniqueName: \"kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273071 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273130 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273236 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273308 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273703 4885 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273714 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.274846 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.280233 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.280420 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj" (OuterVolumeSpecName: "kube-api-access-fffzj") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "kube-api-access-fffzj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.281033 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.289073 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.313542 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ll64z"
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.375247 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.375279 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fffzj\" (UniqueName: \"kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.375289 4885 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.375299 4885 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.375307 4885 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.426668 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83db30de-c4aa-4a2f-9e5f-e4545e4ff475" path="/var/lib/kubelet/pods/83db30de-c4aa-4a2f-9e5f-e4545e4ff475/volumes"
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.427318 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f432b919-7772-41bf-9113-94eefe45e347" path="/var/lib/kubelet/pods/f432b919-7772-41bf-9113-94eefe45e347/volumes"
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.482435 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxtkz\" (UniqueName: \"kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz\") pod \"f7884923-e1d5-4b4d-a285-680bfbe38277\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.482499 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts\") pod \"f7884923-e1d5-4b4d-a285-680bfbe38277\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.496180 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz" (OuterVolumeSpecName: "kube-api-access-nxtkz") pod "f7884923-e1d5-4b4d-a285-680bfbe38277" (UID: "f7884923-e1d5-4b4d-a285-680bfbe38277"). InnerVolumeSpecName "kube-api-access-nxtkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.496314 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7884923-e1d5-4b4d-a285-680bfbe38277" (UID: "f7884923-e1d5-4b4d-a285-680bfbe38277"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.522217 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3705-account-create-update-2brz9"
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.530650 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-25qrp"
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.546553 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3284-account-create-update-qht6h"
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.596148 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxtkz\" (UniqueName: \"kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.596191 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp66p\" (UniqueName: \"kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p\") pod \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698410 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts\") pod \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698455 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts\") pod \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698499 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqnf9\" (UniqueName: \"kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9\") pod \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698519 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g56sf\" (UniqueName: \"kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf\") pod \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698578 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts\") pod \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") "
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.699190 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6fff4a7a-1b14-4e29-8c84-d7fc55de879c" (UID: "6fff4a7a-1b14-4e29-8c84-d7fc55de879c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.699456 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b3418f5-a92a-4fe6-b0ea-929b54ecb052" (UID: "8b3418f5-a92a-4fe6-b0ea-929b54ecb052"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.699972 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "761f5c93-2ed3-43f0-acaf-ee92d0719ec3" (UID: "761f5c93-2ed3-43f0-acaf-ee92d0719ec3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.702007 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9" (OuterVolumeSpecName: "kube-api-access-hqnf9") pod "6fff4a7a-1b14-4e29-8c84-d7fc55de879c" (UID: "6fff4a7a-1b14-4e29-8c84-d7fc55de879c"). InnerVolumeSpecName "kube-api-access-hqnf9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.703085 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf" (OuterVolumeSpecName: "kube-api-access-g56sf") pod "761f5c93-2ed3-43f0-acaf-ee92d0719ec3" (UID: "761f5c93-2ed3-43f0-acaf-ee92d0719ec3"). InnerVolumeSpecName "kube-api-access-g56sf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.703587 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p" (OuterVolumeSpecName: "kube-api-access-fp66p") pod "8b3418f5-a92a-4fe6-b0ea-929b54ecb052" (UID: "8b3418f5-a92a-4fe6-b0ea-929b54ecb052"). InnerVolumeSpecName "kube-api-access-fp66p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.707821 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mn5x8"]
Mar 08 19:51:41 crc kubenswrapper[4885]: W0308 19:51:41.711333 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4353f36_d8f9_41ff_8062_f874bd53ef12.slice/crio-d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e WatchSource:0}: Error finding container d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e: Status 404 returned error can't find the container with id d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800190 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800210 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp66p\" (UniqueName: \"kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800221 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800231 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800239 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqnf9\" (UniqueName: \"kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800246 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g56sf\" (UniqueName: \"kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf\") on node \"crc\" DevicePath \"\""
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:41.999213 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-25qrp" event={"ID":"761f5c93-2ed3-43f0-acaf-ee92d0719ec3","Type":"ContainerDied","Data":"0de5eb20fec0068fad7220dbd8c11cda9bad6ac30e75bf5f0e0e2846ae820c1a"}
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:41.999280 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de5eb20fec0068fad7220dbd8c11cda9bad6ac30e75bf5f0e0e2846ae820c1a"
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:41.999318 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-25qrp"
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.000821 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ll64z" event={"ID":"f7884923-e1d5-4b4d-a285-680bfbe38277","Type":"ContainerDied","Data":"386f6aaa7f8daad0074b28ca5968bd9be45e6528e49c396b92ec2011c6026b34"}
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.000856 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="386f6aaa7f8daad0074b28ca5968bd9be45e6528e49c396b92ec2011c6026b34"
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.000861 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ll64z"
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.003687 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0"
Mar 08 19:51:42 crc kubenswrapper[4885]: E0308 19:51:42.003862 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 19:51:42 crc kubenswrapper[4885]: E0308 19:51:42.003885 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 19:51:42 crc kubenswrapper[4885]: E0308 19:51:42.003955 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. No retries permitted until 2026-03-08 19:51:44.003934245 +0000 UTC m=+1205.399988268 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.016056 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" event={"ID":"67aa348d-fe05-4e05-af01-a0b22d170a9b","Type":"ContainerStarted","Data":"701f88d27bc0253f9783e551e39a9c634999051422220378d318c698c63ae0f5"}
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.016320 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f7dd995-2cszd"
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.019256 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3284-account-create-update-qht6h" event={"ID":"6fff4a7a-1b14-4e29-8c84-d7fc55de879c","Type":"ContainerDied","Data":"bd759665cb6fe07543d620b6ac74746d2ebba785ee89af9724b1dbf768d76262"}
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.019316 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd759665cb6fe07543d620b6ac74746d2ebba785ee89af9724b1dbf768d76262"
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.019371 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3284-account-create-update-qht6h"
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.028165 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3705-account-create-update-2brz9" event={"ID":"8b3418f5-a92a-4fe6-b0ea-929b54ecb052","Type":"ContainerDied","Data":"9b1459e7cf206ea3c6375a47a07ece269bc28d4b4c10bc223fc70b9125df1823"}
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.028501 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b1459e7cf206ea3c6375a47a07ece269bc28d4b4c10bc223fc70b9125df1823"
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.028564 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3705-account-create-update-2brz9"
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.031499 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rqttf"
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.031592 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mn5x8" event={"ID":"e4353f36-d8f9-41ff-8062-f874bd53ef12","Type":"ContainerStarted","Data":"d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e"}
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.038748 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" podStartSLOduration=3.03873029 podStartE2EDuration="3.03873029s" podCreationTimestamp="2026-03-08 19:51:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:42.035488554 +0000 UTC m=+1203.431542587" watchObservedRunningTime="2026-03-08 19:51:42.03873029 +0000 UTC m=+1203.434784313"
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.086014 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rqttf"]
Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.092346 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-rqttf"]
Mar 08 19:51:43 crc kubenswrapper[4885]: I0308 19:51:43.376553 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" path="/var/lib/kubelet/pods/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d/volumes"
Mar 08 19:51:44 crc kubenswrapper[4885]: I0308 19:51:44.044632 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0"
Mar 08 19:51:44 crc kubenswrapper[4885]: E0308 19:51:44.044778 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 19:51:44 crc kubenswrapper[4885]: E0308 19:51:44.045457 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 19:51:44 crc kubenswrapper[4885]: E0308 19:51:44.045510 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. No retries permitted until 2026-03-08 19:51:48.045493935 +0000 UTC m=+1209.441547958 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.212941 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2b5kk"]
Mar 08 19:51:45 crc kubenswrapper[4885]: E0308 19:51:45.213249 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3418f5-a92a-4fe6-b0ea-929b54ecb052" containerName="mariadb-account-create-update"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213261 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3418f5-a92a-4fe6-b0ea-929b54ecb052" containerName="mariadb-account-create-update"
Mar 08 19:51:45 crc kubenswrapper[4885]: E0308 19:51:45.213287 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fff4a7a-1b14-4e29-8c84-d7fc55de879c" containerName="mariadb-account-create-update"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213293 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fff4a7a-1b14-4e29-8c84-d7fc55de879c" containerName="mariadb-account-create-update"
Mar 08 19:51:45 crc kubenswrapper[4885]: E0308 19:51:45.213305 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7884923-e1d5-4b4d-a285-680bfbe38277" containerName="mariadb-database-create"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213312 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7884923-e1d5-4b4d-a285-680bfbe38277" containerName="mariadb-database-create"
Mar 08 19:51:45 crc kubenswrapper[4885]: E0308 19:51:45.213324 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761f5c93-2ed3-43f0-acaf-ee92d0719ec3" containerName="mariadb-database-create"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213330 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="761f5c93-2ed3-43f0-acaf-ee92d0719ec3" containerName="mariadb-database-create"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213463 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7884923-e1d5-4b4d-a285-680bfbe38277" containerName="mariadb-database-create"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213480 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="761f5c93-2ed3-43f0-acaf-ee92d0719ec3" containerName="mariadb-database-create"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213490 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fff4a7a-1b14-4e29-8c84-d7fc55de879c" containerName="mariadb-account-create-update"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213503 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3418f5-a92a-4fe6-b0ea-929b54ecb052" containerName="mariadb-account-create-update"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.214532 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2b5kk"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.216512 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.221257 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2b5kk"]
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.272285 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.272398 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nr5k\" (UniqueName: \"kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.374462 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.374539 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nr5k\" (UniqueName: \"kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.375265 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.392418 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nr5k\" (UniqueName: \"kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk"
Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.580207 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2b5kk"
Mar 08 19:51:46 crc kubenswrapper[4885]: I0308 19:51:46.070140 4885 generic.go:334] "Generic (PLEG): container finished" podID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerID="67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0" exitCode=0
Mar 08 19:51:46 crc kubenswrapper[4885]: I0308 19:51:46.070209 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerDied","Data":"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0"}
Mar 08 19:51:46 crc kubenswrapper[4885]: I0308 19:51:46.072734 4885 generic.go:334] "Generic (PLEG): container finished" podID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerID="f7d40d12aee399534fa9d02af86ea25978b99ea1398acccdac988f16615d42dd" exitCode=0
Mar 08 19:51:46 crc kubenswrapper[4885]: I0308 19:51:46.072803 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerDied","Data":"f7d40d12aee399534fa9d02af86ea25978b99ea1398acccdac988f16615d42dd"}
Mar 08 19:51:48 crc kubenswrapper[4885]: I0308 19:51:48.125790 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0"
Mar 08 19:51:48 crc kubenswrapper[4885]: E0308 19:51:48.126003 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 08 19:51:48 crc kubenswrapper[4885]: E0308 19:51:48.126223 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 08 19:51:48 crc kubenswrapper[4885]: E0308 19:51:48.126284 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. No retries permitted until 2026-03-08 19:51:56.126266942 +0000 UTC m=+1217.522320955 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found
Mar 08 19:51:49 crc kubenswrapper[4885]: I0308 19:51:49.537259 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f7dd995-2cszd"
Mar 08 19:51:49 crc kubenswrapper[4885]: I0308 19:51:49.602527 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"]
Mar 08 19:51:49 crc kubenswrapper[4885]: I0308 19:51:49.602742 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="dnsmasq-dns" containerID="cri-o://b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97" gracePeriod=10
Mar 08 19:51:50 crc kubenswrapper[4885]: E0308 19:51:50.575664 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod201fa134_20f7_4902_8fd4_ba352e7f4e95.slice/crio-conmon-b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97.scope\": RecentStats: unable to find data in memory cache]"
Mar 08 19:51:50 crc kubenswrapper[4885]: I0308 19:51:50.846172 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused"
Mar 08 19:51:51 crc kubenswrapper[4885]: I0308 19:51:51.133882 4885 generic.go:334] "Generic (PLEG): container finished" podID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerID="b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97" exitCode=0
Mar 08 19:51:51 crc kubenswrapper[4885]: I0308 19:51:51.133994 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" event={"ID":"201fa134-20f7-4902-8fd4-ba352e7f4e95","Type":"ContainerDied","Data":"b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97"}
Mar 08 19:51:51 crc kubenswrapper[4885]: I0308 19:51:51.477500 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 08 19:51:53 crc kubenswrapper[4885]: I0308 19:51:53.428989 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mn4lz" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller" probeResult="failure" output=<
Mar 08 19:51:53 crc kubenswrapper[4885]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 08 19:51:53 crc kubenswrapper[4885]: >
Mar 08 19:51:55 crc kubenswrapper[4885]: E0308 19:51:55.282190 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07"
Mar 08 19:51:55 crc kubenswrapper[4885]: E0308 19:51:55.282789 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfl6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-pq8mq_openstack(618b5189-8b29-473f-b59c-e911fca71041): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 08 19:51:55 crc kubenswrapper[4885]: E0308 19:51:55.284055 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-pq8mq" podUID="618b5189-8b29-473f-b59c-e911fca71041"
Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.584286 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp"
Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.667010 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb\") pod \"201fa134-20f7-4902-8fd4-ba352e7f4e95\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") "
Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.667228 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc\") pod \"201fa134-20f7-4902-8fd4-ba352e7f4e95\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") "
Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.667259 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config\") pod \"201fa134-20f7-4902-8fd4-ba352e7f4e95\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") "
Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.667393 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb\") pod \"201fa134-20f7-4902-8fd4-ba352e7f4e95\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") "
Mar 08 19:51:55
crc kubenswrapper[4885]: I0308 19:51:55.667515 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jm7q\" (UniqueName: \"kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q\") pod \"201fa134-20f7-4902-8fd4-ba352e7f4e95\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.671682 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q" (OuterVolumeSpecName: "kube-api-access-6jm7q") pod "201fa134-20f7-4902-8fd4-ba352e7f4e95" (UID: "201fa134-20f7-4902-8fd4-ba352e7f4e95"). InnerVolumeSpecName "kube-api-access-6jm7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.700657 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "201fa134-20f7-4902-8fd4-ba352e7f4e95" (UID: "201fa134-20f7-4902-8fd4-ba352e7f4e95"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.709004 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "201fa134-20f7-4902-8fd4-ba352e7f4e95" (UID: "201fa134-20f7-4902-8fd4-ba352e7f4e95"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.710450 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "201fa134-20f7-4902-8fd4-ba352e7f4e95" (UID: "201fa134-20f7-4902-8fd4-ba352e7f4e95"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.712778 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config" (OuterVolumeSpecName: "config") pod "201fa134-20f7-4902-8fd4-ba352e7f4e95" (UID: "201fa134-20f7-4902-8fd4-ba352e7f4e95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.769409 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.769447 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jm7q\" (UniqueName: \"kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.769465 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.769477 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:55 crc 
kubenswrapper[4885]: I0308 19:51:55.769489 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.825284 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2b5kk"] Mar 08 19:51:55 crc kubenswrapper[4885]: W0308 19:51:55.828368 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b6d8115_d92a_4305_a2d2_8d9874a81390.slice/crio-bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb WatchSource:0}: Error finding container bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb: Status 404 returned error can't find the container with id bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.176301 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:56 crc kubenswrapper[4885]: E0308 19:51:56.176544 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 19:51:56 crc kubenswrapper[4885]: E0308 19:51:56.176581 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 19:51:56 crc kubenswrapper[4885]: E0308 19:51:56.176642 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. 
No retries permitted until 2026-03-08 19:52:12.176620897 +0000 UTC m=+1233.572674940 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.191761 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerStarted","Data":"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e"} Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.192073 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.195522 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerStarted","Data":"ab00747eae0e5726409cc3faafb18065815833a98680950d3e1962529cb0f73d"} Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.195757 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.197700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mn5x8" event={"ID":"e4353f36-d8f9-41ff-8062-f874bd53ef12","Type":"ContainerStarted","Data":"2f31946378ed0ae4efcfd55a18f638cc84b0a18f97193739711ef28dac2174f9"} Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.200108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" event={"ID":"201fa134-20f7-4902-8fd4-ba352e7f4e95","Type":"ContainerDied","Data":"4817fdc2e93c45edc661de711d2e74d24fa8c501db17ccf9809b6e4e92461549"} Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 
19:51:56.200149 4885 scope.go:117] "RemoveContainer" containerID="b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.200412 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.201887 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2b5kk" event={"ID":"3b6d8115-d92a-4305-a2d2-8d9874a81390","Type":"ContainerStarted","Data":"58b318e6af3a5db8b09b96a9de226a379d7375fee61bd37b949548ceef13806c"} Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.201974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2b5kk" event={"ID":"3b6d8115-d92a-4305-a2d2-8d9874a81390","Type":"ContainerStarted","Data":"bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb"} Mar 08 19:51:56 crc kubenswrapper[4885]: E0308 19:51:56.204370 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07\\\"\"" pod="openstack/glance-db-sync-pq8mq" podUID="618b5189-8b29-473f-b59c-e911fca71041" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.228562 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=46.851361303 podStartE2EDuration="1m4.228535369s" podCreationTimestamp="2026-03-08 19:50:52 +0000 UTC" firstStartedPulling="2026-03-08 19:50:54.394218989 +0000 UTC m=+1155.790273012" lastFinishedPulling="2026-03-08 19:51:11.771393025 +0000 UTC m=+1173.167447078" observedRunningTime="2026-03-08 19:51:56.227787999 +0000 UTC m=+1217.623842032" watchObservedRunningTime="2026-03-08 19:51:56.228535369 
+0000 UTC m=+1217.624589422" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.236862 4885 scope.go:117] "RemoveContainer" containerID="e6cf715e1922fcc27c66130c6d2113c75ad20d6e5d12122260396ed02a84d181" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.263954 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mn5x8" podStartSLOduration=2.611882495 podStartE2EDuration="16.263934311s" podCreationTimestamp="2026-03-08 19:51:40 +0000 UTC" firstStartedPulling="2026-03-08 19:51:41.713397261 +0000 UTC m=+1203.109451284" lastFinishedPulling="2026-03-08 19:51:55.365449047 +0000 UTC m=+1216.761503100" observedRunningTime="2026-03-08 19:51:56.260422218 +0000 UTC m=+1217.656476251" watchObservedRunningTime="2026-03-08 19:51:56.263934311 +0000 UTC m=+1217.659988344" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.283931 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-2b5kk" podStartSLOduration=11.283899973 podStartE2EDuration="11.283899973s" podCreationTimestamp="2026-03-08 19:51:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:56.277696907 +0000 UTC m=+1217.673750940" watchObservedRunningTime="2026-03-08 19:51:56.283899973 +0000 UTC m=+1217.679953996" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.314432 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=46.289334076 podStartE2EDuration="1m5.314413314s" podCreationTimestamp="2026-03-08 19:50:51 +0000 UTC" firstStartedPulling="2026-03-08 19:50:53.527935881 +0000 UTC m=+1154.923989904" lastFinishedPulling="2026-03-08 19:51:12.553015119 +0000 UTC m=+1173.949069142" observedRunningTime="2026-03-08 19:51:56.308350373 +0000 UTC m=+1217.704404396" watchObservedRunningTime="2026-03-08 19:51:56.314413314 
+0000 UTC m=+1217.710467337" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.355900 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"] Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.364767 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"] Mar 08 19:51:57 crc kubenswrapper[4885]: I0308 19:51:57.213059 4885 generic.go:334] "Generic (PLEG): container finished" podID="3b6d8115-d92a-4305-a2d2-8d9874a81390" containerID="58b318e6af3a5db8b09b96a9de226a379d7375fee61bd37b949548ceef13806c" exitCode=0 Mar 08 19:51:57 crc kubenswrapper[4885]: I0308 19:51:57.213106 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2b5kk" event={"ID":"3b6d8115-d92a-4305-a2d2-8d9874a81390","Type":"ContainerDied","Data":"58b318e6af3a5db8b09b96a9de226a379d7375fee61bd37b949548ceef13806c"} Mar 08 19:51:57 crc kubenswrapper[4885]: I0308 19:51:57.385768 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" path="/var/lib/kubelet/pods/201fa134-20f7-4902-8fd4-ba352e7f4e95/volumes" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.397484 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mn4lz" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller" probeResult="failure" output=< Mar 08 19:51:58 crc kubenswrapper[4885]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 08 19:51:58 crc kubenswrapper[4885]: > Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.464185 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.471803 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:58 crc 
kubenswrapper[4885]: I0308 19:51:58.602644 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2b5kk" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690102 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mn4lz-config-hxzwd"] Mar 08 19:51:58 crc kubenswrapper[4885]: E0308 19:51:58.690387 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b6d8115-d92a-4305-a2d2-8d9874a81390" containerName="mariadb-account-create-update" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690404 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b6d8115-d92a-4305-a2d2-8d9874a81390" containerName="mariadb-account-create-update" Mar 08 19:51:58 crc kubenswrapper[4885]: E0308 19:51:58.690413 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="dnsmasq-dns" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690421 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="dnsmasq-dns" Mar 08 19:51:58 crc kubenswrapper[4885]: E0308 19:51:58.690434 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="init" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690442 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="init" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690595 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b6d8115-d92a-4305-a2d2-8d9874a81390" containerName="mariadb-account-create-update" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690604 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="dnsmasq-dns" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 
19:51:58.691123 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.693135 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.705915 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz-config-hxzwd"] Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.726978 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts\") pod \"3b6d8115-d92a-4305-a2d2-8d9874a81390\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.727151 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nr5k\" (UniqueName: \"kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k\") pod \"3b6d8115-d92a-4305-a2d2-8d9874a81390\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.728011 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b6d8115-d92a-4305-a2d2-8d9874a81390" (UID: "3b6d8115-d92a-4305-a2d2-8d9874a81390"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.733192 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k" (OuterVolumeSpecName: "kube-api-access-9nr5k") pod "3b6d8115-d92a-4305-a2d2-8d9874a81390" (UID: "3b6d8115-d92a-4305-a2d2-8d9874a81390"). InnerVolumeSpecName "kube-api-access-9nr5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.828931 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829029 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829077 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829119 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j46k\" (UniqueName: \"kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829141 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829197 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nr5k\" (UniqueName: \"kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829212 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.930785 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " 
pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.930977 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931098 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j46k\" (UniqueName: \"kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931151 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931259 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931324 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " 
pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931614 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931629 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931985 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.932733 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc 
kubenswrapper[4885]: I0308 19:51:58.956304 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j46k\" (UniqueName: \"kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:59 crc kubenswrapper[4885]: I0308 19:51:59.006728 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:59 crc kubenswrapper[4885]: I0308 19:51:59.245719 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2b5kk" event={"ID":"3b6d8115-d92a-4305-a2d2-8d9874a81390","Type":"ContainerDied","Data":"bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb"} Mar 08 19:51:59 crc kubenswrapper[4885]: I0308 19:51:59.246261 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb" Mar 08 19:51:59 crc kubenswrapper[4885]: I0308 19:51:59.245754 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2b5kk" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.262043 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549992-pzbpd"] Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.263888 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.267600 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.267952 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.269658 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.290472 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549992-pzbpd"] Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.317996 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz-config-hxzwd"] Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.357958 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6w8s\" (UniqueName: \"kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s\") pod \"auto-csr-approver-29549992-pzbpd\" (UID: \"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec\") " pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.459324 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6w8s\" (UniqueName: \"kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s\") pod \"auto-csr-approver-29549992-pzbpd\" (UID: \"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec\") " pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.481899 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6w8s\" (UniqueName: 
\"kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s\") pod \"auto-csr-approver-29549992-pzbpd\" (UID: \"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec\") " pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.581632 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:01 crc kubenswrapper[4885]: I0308 19:52:01.038787 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549992-pzbpd"] Mar 08 19:52:01 crc kubenswrapper[4885]: W0308 19:52:01.046835 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb9e9e0_35d3_4473_ad7b_7b44fd44e8ec.slice/crio-79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8 WatchSource:0}: Error finding container 79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8: Status 404 returned error can't find the container with id 79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8 Mar 08 19:52:01 crc kubenswrapper[4885]: I0308 19:52:01.261371 4885 generic.go:334] "Generic (PLEG): container finished" podID="b08aa6bb-932f-4790-a637-f3667471149c" containerID="d9ea1c70756e397df6785ca6ac5c032d1dcba35d8ce3a74fd9e9a044ec85b1ad" exitCode=0 Mar 08 19:52:01 crc kubenswrapper[4885]: I0308 19:52:01.261456 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-hxzwd" event={"ID":"b08aa6bb-932f-4790-a637-f3667471149c","Type":"ContainerDied","Data":"d9ea1c70756e397df6785ca6ac5c032d1dcba35d8ce3a74fd9e9a044ec85b1ad"} Mar 08 19:52:01 crc kubenswrapper[4885]: I0308 19:52:01.261520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-hxzwd" 
event={"ID":"b08aa6bb-932f-4790-a637-f3667471149c","Type":"ContainerStarted","Data":"d88ceadc1f3902a82df865ef4e141e1ea2dcb8aae828eddebf810d0e50045854"} Mar 08 19:52:01 crc kubenswrapper[4885]: I0308 19:52:01.263285 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" event={"ID":"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec","Type":"ContainerStarted","Data":"79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8"} Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.277557 4885 generic.go:334] "Generic (PLEG): container finished" podID="e4353f36-d8f9-41ff-8062-f874bd53ef12" containerID="2f31946378ed0ae4efcfd55a18f638cc84b0a18f97193739711ef28dac2174f9" exitCode=0 Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.278179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mn5x8" event={"ID":"e4353f36-d8f9-41ff-8062-f874bd53ef12","Type":"ContainerDied","Data":"2f31946378ed0ae4efcfd55a18f638cc84b0a18f97193739711ef28dac2174f9"} Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.667178 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711194 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711278 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711334 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711375 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711437 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j46k\" (UniqueName: \"kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711451 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711544 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711600 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.712044 4885 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.712063 4885 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711994 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run" (OuterVolumeSpecName: "var-run") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.712262 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.712995 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts" (OuterVolumeSpecName: "scripts") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.717215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k" (OuterVolumeSpecName: "kube-api-access-4j46k") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "kube-api-access-4j46k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.814667 4885 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.815098 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j46k\" (UniqueName: \"kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.815114 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.815125 4885 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.287752 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-hxzwd" event={"ID":"b08aa6bb-932f-4790-a637-f3667471149c","Type":"ContainerDied","Data":"d88ceadc1f3902a82df865ef4e141e1ea2dcb8aae828eddebf810d0e50045854"} Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.287803 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d88ceadc1f3902a82df865ef4e141e1ea2dcb8aae828eddebf810d0e50045854" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.288869 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.289611 4885 generic.go:334] "Generic (PLEG): container finished" podID="deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" containerID="e06b6952cf7f49bba090c40e1251201f80874fce311561d18f2cd3c7169feb77" exitCode=0 Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.289711 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" event={"ID":"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec","Type":"ContainerDied","Data":"e06b6952cf7f49bba090c40e1251201f80874fce311561d18f2cd3c7169feb77"} Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.412144 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-mn4lz" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.616018 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.727966 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728103 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728127 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf\") pod 
\"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728191 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728276 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728297 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728370 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlg4w\" (UniqueName: \"kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728992 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.729071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.754495 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w" (OuterVolumeSpecName: "kube-api-access-rlg4w") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "kube-api-access-rlg4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.757747 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.759375 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.759505 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.768628 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts" (OuterVolumeSpecName: "scripts") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.799230 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mn4lz-config-hxzwd"] Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.813638 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mn4lz-config-hxzwd"] Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830103 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlg4w\" (UniqueName: \"kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830136 4885 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830151 4885 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf\") on 
node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830162 4885 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830174 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830185 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830200 4885 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.841629 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mn4lz-config-n4lpc"] Mar 08 19:52:03 crc kubenswrapper[4885]: E0308 19:52:03.842009 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08aa6bb-932f-4790-a637-f3667471149c" containerName="ovn-config" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.842026 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08aa6bb-932f-4790-a637-f3667471149c" containerName="ovn-config" Mar 08 19:52:03 crc kubenswrapper[4885]: E0308 19:52:03.842046 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4353f36-d8f9-41ff-8062-f874bd53ef12" containerName="swift-ring-rebalance" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.842055 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e4353f36-d8f9-41ff-8062-f874bd53ef12" containerName="swift-ring-rebalance" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.842232 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08aa6bb-932f-4790-a637-f3667471149c" containerName="ovn-config" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.842251 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4353f36-d8f9-41ff-8062-f874bd53ef12" containerName="swift-ring-rebalance" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.842738 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.844493 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.856369 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz-config-n4lpc"] Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931657 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c76m\" (UniqueName: \"kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931697 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931724 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931745 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931843 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931879 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033222 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 
19:52:04.033339 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c76m\" (UniqueName: \"kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033389 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033410 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033496 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 
19:52:04.033695 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033694 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.034801 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.035893 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.059850 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8c76m\" (UniqueName: \"kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.168732 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.303022 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.303187 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mn5x8" event={"ID":"e4353f36-d8f9-41ff-8062-f874bd53ef12","Type":"ContainerDied","Data":"d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e"} Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.303698 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.653490 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz-config-n4lpc"] Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.661130 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549992-pzbpd"
Mar 08 19:52:04 crc kubenswrapper[4885]: W0308 19:52:04.665261 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod795a0dcb_c61f_4d3b_8924_4f04474af216.slice/crio-40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab WatchSource:0}: Error finding container 40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab: Status 404 returned error can't find the container with id 40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab
Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.844997 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6w8s\" (UniqueName: \"kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s\") pod \"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec\" (UID: \"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec\") "
Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.855330 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s" (OuterVolumeSpecName: "kube-api-access-j6w8s") pod "deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" (UID: "deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec"). InnerVolumeSpecName "kube-api-access-j6w8s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.947538 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6w8s\" (UniqueName: \"kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.312756 4885 generic.go:334] "Generic (PLEG): container finished" podID="795a0dcb-c61f-4d3b-8924-4f04474af216" containerID="d9a0dae6743044b0ee2ed3030e29d6fe34bb42caf427155033310333a42d0a5a" exitCode=0
Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.312840 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-n4lpc" event={"ID":"795a0dcb-c61f-4d3b-8924-4f04474af216","Type":"ContainerDied","Data":"d9a0dae6743044b0ee2ed3030e29d6fe34bb42caf427155033310333a42d0a5a"}
Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.312872 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-n4lpc" event={"ID":"795a0dcb-c61f-4d3b-8924-4f04474af216","Type":"ContainerStarted","Data":"40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab"}
Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.315373 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" event={"ID":"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec","Type":"ContainerDied","Data":"79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8"}
Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.315406 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8"
Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.315469 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549992-pzbpd"
Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.388798 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08aa6bb-932f-4790-a637-f3667471149c" path="/var/lib/kubelet/pods/b08aa6bb-932f-4790-a637-f3667471149c/volumes"
Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.742474 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549986-mpsgk"]
Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.757585 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549986-mpsgk"]
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.755146 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-n4lpc"
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.888695 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") "
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") "
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889342 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") "
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889734 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") "
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889875 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c76m\" (UniqueName: \"kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") "
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889886 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.890060 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") "
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.890268 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.890397 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run" (OuterVolumeSpecName: "var-run") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.890636 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts" (OuterVolumeSpecName: "scripts") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.891325 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.891380 4885 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.891410 4885 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.891440 4885 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.891464 4885 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.896515 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m" (OuterVolumeSpecName: "kube-api-access-8c76m") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "kube-api-access-8c76m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.993193 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c76m\" (UniqueName: \"kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.339216 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-n4lpc" event={"ID":"795a0dcb-c61f-4d3b-8924-4f04474af216","Type":"ContainerDied","Data":"40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab"}
Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.339272 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab"
Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.339345 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-n4lpc"
Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.382960 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa96ded3-40b5-4e54-9f54-72f64edfb672" path="/var/lib/kubelet/pods/fa96ded3-40b5-4e54-9f54-72f64edfb672/volumes"
Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.881196 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mn4lz-config-n4lpc"]
Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.889830 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mn4lz-config-n4lpc"]
Mar 08 19:52:09 crc kubenswrapper[4885]: I0308 19:52:09.392393 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="795a0dcb-c61f-4d3b-8924-4f04474af216" path="/var/lib/kubelet/pods/795a0dcb-c61f-4d3b-8924-4f04474af216/volumes"
Mar 08 19:52:11 crc kubenswrapper[4885]: I0308 19:52:11.383337 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pq8mq" event={"ID":"618b5189-8b29-473f-b59c-e911fca71041","Type":"ContainerStarted","Data":"88f0fd52df3aa60bc754c49bef747bbf48ae9a2eeb839f1af08e43921bc83090"}
Mar 08 19:52:11 crc kubenswrapper[4885]: I0308 19:52:11.410464 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-pq8mq" podStartSLOduration=2.828529803 podStartE2EDuration="34.410351643s" podCreationTimestamp="2026-03-08 19:51:37 +0000 UTC" firstStartedPulling="2026-03-08 19:51:38.249720449 +0000 UTC m=+1199.645774482" lastFinishedPulling="2026-03-08 19:52:09.831542269 +0000 UTC m=+1231.227596322" observedRunningTime="2026-03-08 19:52:11.402229735 +0000 UTC m=+1232.798283768" watchObservedRunningTime="2026-03-08 19:52:11.410351643 +0000 UTC m=+1232.806405666"
Mar 08 19:52:12 crc kubenswrapper[4885]: I0308 19:52:12.181774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0"
Mar 08 19:52:12 crc kubenswrapper[4885]: I0308 19:52:12.192595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0"
Mar 08 19:52:12 crc kubenswrapper[4885]: I0308 19:52:12.338351 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 08 19:52:12 crc kubenswrapper[4885]: I0308 19:52:12.782254 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 08 19:52:12 crc kubenswrapper[4885]: W0308 19:52:12.785172 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa276a05_ab6a_4aa1_9a9f_a990dc1513bd.slice/crio-b1e0ba5e2e1e5dbd0a9f1e5b591c441cf6aacd9c13c59e7ce49d3304923b9158 WatchSource:0}: Error finding container b1e0ba5e2e1e5dbd0a9f1e5b591c441cf6aacd9c13c59e7ce49d3304923b9158: Status 404 returned error can't find the container with id b1e0ba5e2e1e5dbd0a9f1e5b591c441cf6aacd9c13c59e7ce49d3304923b9158
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.009304 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.317489 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8wdm8"]
Mar 08 19:52:13 crc kubenswrapper[4885]: E0308 19:52:13.317826 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" containerName="oc"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.317844 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" containerName="oc"
Mar 08 19:52:13 crc kubenswrapper[4885]: E0308 19:52:13.317861 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795a0dcb-c61f-4d3b-8924-4f04474af216" containerName="ovn-config"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.317868 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="795a0dcb-c61f-4d3b-8924-4f04474af216" containerName="ovn-config"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.318050 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="795a0dcb-c61f-4d3b-8924-4f04474af216" containerName="ovn-config"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.318070 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" containerName="oc"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.318547 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8wdm8"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.327396 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8wdm8"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.432730 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f33b-account-create-update-hcg7j"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.438455 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f33b-account-create-update-hcg7j"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.440011 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"b1e0ba5e2e1e5dbd0a9f1e5b591c441cf6aacd9c13c59e7ce49d3304923b9158"}
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.440845 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.456884 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f33b-account-create-update-hcg7j"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.509852 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.509910 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nzxm\" (UniqueName: \"kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.611297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4swl\" (UniqueName: \"kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.611988 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.612164 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.612314 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nzxm\" (UniqueName: \"kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.613384 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.618979 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-td7dc"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.620096 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-td7dc"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.641376 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-86ea-account-create-update-tdtcq"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.642430 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86ea-account-create-update-tdtcq"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.647213 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.657394 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-td7dc"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.668550 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nzxm\" (UniqueName: \"kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.668572 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-86ea-account-create-update-tdtcq"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.673653 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8wdm8"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.713979 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4swl\" (UniqueName: \"kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.714227 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.715039 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.715191 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-zfp8t"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.716399 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zfp8t"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.725882 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.726274 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.726470 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.726697 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nj2kw"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.726948 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zfp8t"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.740131 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-wphzx"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.741571 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wphzx"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.742489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4swl\" (UniqueName: \"kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.749485 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wphzx"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.771198 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f33b-account-create-update-hcg7j"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.815802 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.815862 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.815887 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqms\" (UniqueName: \"kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.815952 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts\") pod \"barbican-db-create-td7dc\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.816147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.816203 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp454\" (UniqueName: \"kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454\") pod \"barbican-db-create-td7dc\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.816230 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvwxr\" (UniqueName: \"kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.821090 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.850948 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-031a-account-create-update-qvdpr"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.852063 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-031a-account-create-update-qvdpr"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.859129 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.869264 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-031a-account-create-update-qvdpr"]
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917590 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917696 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917718 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqms\" (UniqueName: \"kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917767 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts\") pod \"barbican-db-create-td7dc\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917820 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78wf8\" (UniqueName: \"kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917844 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.918038 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp454\" (UniqueName: \"kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454\") pod \"barbican-db-create-td7dc\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.918069 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvwxr\" (UniqueName: \"kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.919543 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.922944 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts\") pod \"barbican-db-create-td7dc\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.934631 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.935191 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvwxr\" (UniqueName: \"kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.941195 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.949277 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxqms\" (UniqueName: \"kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.956792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp454\" (UniqueName: \"kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454\") pod \"barbican-db-create-td7dc\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc"
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.993077 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86ea-account-create-update-tdtcq"
Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.019766 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsfwr\" (UniqueName: \"kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr\") pod \"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr"
Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.019900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx"
Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.020011 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78wf8\" (UniqueName: \"kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx"
Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.020064 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts\") pod \"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr"
Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.021649 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx"
Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.039405 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zfp8t"
Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.042602 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78wf8\" (UniqueName: \"kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx"
Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.121215 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts\") pod \"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr"
Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.121260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsfwr\" (UniqueName: \"kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr\") pod \"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr"
Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.122219 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts\") pod \"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr"
Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.144056 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsfwr\" (UniqueName: \"kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr\") pod
\"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.145817 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.175551 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.234263 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.314164 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8wdm8"] Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.439879 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zfp8t"] Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.448644 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f33b-account-create-update-hcg7j"] Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.520668 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-86ea-account-create-update-tdtcq"] Mar 08 19:52:14 crc kubenswrapper[4885]: W0308 19:52:14.683831 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57032abe_6c4f_4711_9f48_5733d6a29ec3.slice/crio-90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911 WatchSource:0}: Error finding container 90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911: Status 404 returned error can't find the container with id 90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911 Mar 08 19:52:14 crc kubenswrapper[4885]: W0308 19:52:14.686075 4885 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9cdd234_0e3f_4bd4_9382_1f4ca59aeb44.slice/crio-5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e WatchSource:0}: Error finding container 5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e: Status 404 returned error can't find the container with id 5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e Mar 08 19:52:14 crc kubenswrapper[4885]: W0308 19:52:14.691282 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod915cd482_d3dc_42c1_96cc_0fcc18bbaff2.slice/crio-dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead WatchSource:0}: Error finding container dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead: Status 404 returned error can't find the container with id dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.041550 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-031a-account-create-update-qvdpr"] Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.358306 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-td7dc"] Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.364278 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wphzx"] Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.470794 4885 generic.go:334] "Generic (PLEG): container finished" podID="a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" containerID="a1a66f9e3c39e6448e08179a06c354d7f53b5cb971ddf727953fa9e3689c988d" exitCode=0 Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.470846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8wdm8" 
event={"ID":"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44","Type":"ContainerDied","Data":"a1a66f9e3c39e6448e08179a06c354d7f53b5cb971ddf727953fa9e3689c988d"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.470869 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8wdm8" event={"ID":"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44","Type":"ContainerStarted","Data":"5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.474465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zfp8t" event={"ID":"321f89cf-ed1f-4f10-a198-e55c23171363","Type":"ContainerStarted","Data":"ba35ec27137480c28a7924c7547829e67cb3c39063decc95b516bf7fa2fd263c"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.475861 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-tdtcq" event={"ID":"915cd482-d3dc-42c1-96cc-0fcc18bbaff2","Type":"ContainerStarted","Data":"41a09112d08bd7901521db3ad7a70721bb9ad48344056086b5a05f6b55d65d91"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.475885 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-tdtcq" event={"ID":"915cd482-d3dc-42c1-96cc-0fcc18bbaff2","Type":"ContainerStarted","Data":"dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.482604 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-qvdpr" event={"ID":"85d25daf-f279-4be1-be4a-75e05e47923c","Type":"ContainerStarted","Data":"33bf72d09b758f81e7db370aead1484095f9a13481bdaed0653c1631df7c254b"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.482645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-qvdpr" 
event={"ID":"85d25daf-f279-4be1-be4a-75e05e47923c","Type":"ContainerStarted","Data":"7f5dea3799ddcfc32ea6b6de68fa84ae194141eb0c42a4c52d68c935699abcf2"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.492100 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wphzx" event={"ID":"4a7dd20b-387a-4061-ab5a-a53ee6a240ef","Type":"ContainerStarted","Data":"b7796e7bfe61bbaa210a6875095f7c4c89b048c5376a3868a20df0f89e8e23e7"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.499800 4885 generic.go:334] "Generic (PLEG): container finished" podID="57032abe-6c4f-4711-9f48-5733d6a29ec3" containerID="17d561daa3a3a15f18cf22c1e06443b53b3323129a45f06fd40855d4bb9fbf6a" exitCode=0 Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.500014 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f33b-account-create-update-hcg7j" event={"ID":"57032abe-6c4f-4711-9f48-5733d6a29ec3","Type":"ContainerDied","Data":"17d561daa3a3a15f18cf22c1e06443b53b3323129a45f06fd40855d4bb9fbf6a"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.500136 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f33b-account-create-update-hcg7j" event={"ID":"57032abe-6c4f-4711-9f48-5733d6a29ec3","Type":"ContainerStarted","Data":"90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.506579 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-td7dc" event={"ID":"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb","Type":"ContainerStarted","Data":"712feccfa7b692c5b0e68df6a7b4fe113fe7bc69a17406d91c0444bea2b37a87"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.524394 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82"} Mar 08 19:52:15 crc 
kubenswrapper[4885]: I0308 19:52:15.524435 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.534028 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-031a-account-create-update-qvdpr" podStartSLOduration=2.534007511 podStartE2EDuration="2.534007511s" podCreationTimestamp="2026-03-08 19:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:15.519297608 +0000 UTC m=+1236.915351631" watchObservedRunningTime="2026-03-08 19:52:15.534007511 +0000 UTC m=+1236.930061534" Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.534521 4885 generic.go:334] "Generic (PLEG): container finished" podID="915cd482-d3dc-42c1-96cc-0fcc18bbaff2" containerID="41a09112d08bd7901521db3ad7a70721bb9ad48344056086b5a05f6b55d65d91" exitCode=0 Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.535018 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-tdtcq" event={"ID":"915cd482-d3dc-42c1-96cc-0fcc18bbaff2","Type":"ContainerDied","Data":"41a09112d08bd7901521db3ad7a70721bb9ad48344056086b5a05f6b55d65d91"} Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.536460 4885 generic.go:334] "Generic (PLEG): container finished" podID="85d25daf-f279-4be1-be4a-75e05e47923c" containerID="33bf72d09b758f81e7db370aead1484095f9a13481bdaed0653c1631df7c254b" exitCode=0 Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.536523 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-qvdpr" 
event={"ID":"85d25daf-f279-4be1-be4a-75e05e47923c","Type":"ContainerDied","Data":"33bf72d09b758f81e7db370aead1484095f9a13481bdaed0653c1631df7c254b"} Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.545440 4885 generic.go:334] "Generic (PLEG): container finished" podID="4a7dd20b-387a-4061-ab5a-a53ee6a240ef" containerID="fd8d322616b5f6a1a40bb29dccbdca346cc9726c81d97364f599af639e9e8eb7" exitCode=0 Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.545612 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wphzx" event={"ID":"4a7dd20b-387a-4061-ab5a-a53ee6a240ef","Type":"ContainerDied","Data":"fd8d322616b5f6a1a40bb29dccbdca346cc9726c81d97364f599af639e9e8eb7"} Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.553794 4885 generic.go:334] "Generic (PLEG): container finished" podID="dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" containerID="6c5fd0c87fdc37dc689d9957740eb80226bab6f4a5010aca5f3d0a66ab0c82c3" exitCode=0 Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.553858 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-td7dc" event={"ID":"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb","Type":"ContainerDied","Data":"6c5fd0c87fdc37dc689d9957740eb80226bab6f4a5010aca5f3d0a66ab0c82c3"} Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.558425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653"} Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.558453 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6"} Mar 08 19:52:17 crc kubenswrapper[4885]: I0308 19:52:17.576491 4885 generic.go:334] "Generic (PLEG): container 
finished" podID="618b5189-8b29-473f-b59c-e911fca71041" containerID="88f0fd52df3aa60bc754c49bef747bbf48ae9a2eeb839f1af08e43921bc83090" exitCode=0 Mar 08 19:52:17 crc kubenswrapper[4885]: I0308 19:52:17.576526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pq8mq" event={"ID":"618b5189-8b29-473f-b59c-e911fca71041","Type":"ContainerDied","Data":"88f0fd52df3aa60bc754c49bef747bbf48ae9a2eeb839f1af08e43921bc83090"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.549035 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.556988 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.565426 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.579373 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8wdm8" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.600097 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.605385 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pq8mq" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.618120 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-qvdpr" event={"ID":"85d25daf-f279-4be1-be4a-75e05e47923c","Type":"ContainerDied","Data":"7f5dea3799ddcfc32ea6b6de68fa84ae194141eb0c42a4c52d68c935699abcf2"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.618164 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f5dea3799ddcfc32ea6b6de68fa84ae194141eb0c42a4c52d68c935699abcf2" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.618237 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.620379 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.620547 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wphzx" event={"ID":"4a7dd20b-387a-4061-ab5a-a53ee6a240ef","Type":"ContainerDied","Data":"b7796e7bfe61bbaa210a6875095f7c4c89b048c5376a3868a20df0f89e8e23e7"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.620576 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7796e7bfe61bbaa210a6875095f7c4c89b048c5376a3868a20df0f89e8e23e7" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.620612 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.622458 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.622458 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f33b-account-create-update-hcg7j" event={"ID":"57032abe-6c4f-4711-9f48-5733d6a29ec3","Type":"ContainerDied","Data":"90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.622561 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.624183 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-td7dc" event={"ID":"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb","Type":"ContainerDied","Data":"712feccfa7b692c5b0e68df6a7b4fe113fe7bc69a17406d91c0444bea2b37a87"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.624209 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="712feccfa7b692c5b0e68df6a7b4fe113fe7bc69a17406d91c0444bea2b37a87" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.624241 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.628388 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsfwr\" (UniqueName: \"kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr\") pod \"85d25daf-f279-4be1-be4a-75e05e47923c\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.628547 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts\") pod \"85d25daf-f279-4be1-be4a-75e05e47923c\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.629193 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85d25daf-f279-4be1-be4a-75e05e47923c" (UID: "85d25daf-f279-4be1-be4a-75e05e47923c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.630183 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8wdm8" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.630387 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8wdm8" event={"ID":"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44","Type":"ContainerDied","Data":"5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.630519 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.637032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pq8mq" event={"ID":"618b5189-8b29-473f-b59c-e911fca71041","Type":"ContainerDied","Data":"6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.637075 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.637141 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pq8mq" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.643333 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr" (OuterVolumeSpecName: "kube-api-access-nsfwr") pod "85d25daf-f279-4be1-be4a-75e05e47923c" (UID: "85d25daf-f279-4be1-be4a-75e05e47923c"). InnerVolumeSpecName "kube-api-access-nsfwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.647302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-tdtcq" event={"ID":"915cd482-d3dc-42c1-96cc-0fcc18bbaff2","Type":"ContainerDied","Data":"dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.647345 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.647459 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729680 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nzxm\" (UniqueName: \"kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm\") pod \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729724 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data\") pod \"618b5189-8b29-473f-b59c-e911fca71041\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729742 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp454\" (UniqueName: \"kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454\") pod \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729798 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data\") pod \"618b5189-8b29-473f-b59c-e911fca71041\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729837 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts\") pod \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729867 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts\") pod \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730468 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" (UID: "dbf04089-b7ff-4c4d-acad-f41d45ac6bfb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730541 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts\") pod \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730573 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxqms\" (UniqueName: \"kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms\") pod \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730853 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "915cd482-d3dc-42c1-96cc-0fcc18bbaff2" (UID: "915cd482-d3dc-42c1-96cc-0fcc18bbaff2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730872 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" (UID: "a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730970 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78wf8\" (UniqueName: \"kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8\") pod \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730993 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4swl\" (UniqueName: \"kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl\") pod \"57032abe-6c4f-4711-9f48-5733d6a29ec3\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731018 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts\") pod \"57032abe-6c4f-4711-9f48-5733d6a29ec3\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731546 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57032abe-6c4f-4711-9f48-5733d6a29ec3" (UID: "57032abe-6c4f-4711-9f48-5733d6a29ec3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731566 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts\") pod \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731634 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle\") pod \"618b5189-8b29-473f-b59c-e911fca71041\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731658 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfl6p\" (UniqueName: \"kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p\") pod \"618b5189-8b29-473f-b59c-e911fca71041\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a7dd20b-387a-4061-ab5a-a53ee6a240ef" (UID: "4a7dd20b-387a-4061-ab5a-a53ee6a240ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.735335 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm" (OuterVolumeSpecName: "kube-api-access-9nzxm") pod "a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" (UID: "a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44"). InnerVolumeSpecName "kube-api-access-9nzxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.735532 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms" (OuterVolumeSpecName: "kube-api-access-xxqms") pod "915cd482-d3dc-42c1-96cc-0fcc18bbaff2" (UID: "915cd482-d3dc-42c1-96cc-0fcc18bbaff2"). InnerVolumeSpecName "kube-api-access-xxqms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.735741 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p" (OuterVolumeSpecName: "kube-api-access-sfl6p") pod "618b5189-8b29-473f-b59c-e911fca71041" (UID: "618b5189-8b29-473f-b59c-e911fca71041"). InnerVolumeSpecName "kube-api-access-sfl6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.736199 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8" (OuterVolumeSpecName: "kube-api-access-78wf8") pod "4a7dd20b-387a-4061-ab5a-a53ee6a240ef" (UID: "4a7dd20b-387a-4061-ab5a-a53ee6a240ef"). InnerVolumeSpecName "kube-api-access-78wf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737090 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxqms\" (UniqueName: \"kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737112 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78wf8\" (UniqueName: \"kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737122 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737130 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsfwr\" (UniqueName: \"kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737139 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737148 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfl6p\" (UniqueName: \"kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737157 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737167 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nzxm\" (UniqueName: \"kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737176 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737184 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737193 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737340 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "618b5189-8b29-473f-b59c-e911fca71041" (UID: "618b5189-8b29-473f-b59c-e911fca71041"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.741717 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454" (OuterVolumeSpecName: "kube-api-access-hp454") pod "dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" (UID: "dbf04089-b7ff-4c4d-acad-f41d45ac6bfb"). InnerVolumeSpecName "kube-api-access-hp454". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.756576 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl" (OuterVolumeSpecName: "kube-api-access-k4swl") pod "57032abe-6c4f-4711-9f48-5733d6a29ec3" (UID: "57032abe-6c4f-4711-9f48-5733d6a29ec3"). InnerVolumeSpecName "kube-api-access-k4swl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.771008 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "618b5189-8b29-473f-b59c-e911fca71041" (UID: "618b5189-8b29-473f-b59c-e911fca71041"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.785629 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data" (OuterVolumeSpecName: "config-data") pod "618b5189-8b29-473f-b59c-e911fca71041" (UID: "618b5189-8b29-473f-b59c-e911fca71041"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.838438 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4swl\" (UniqueName: \"kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.838484 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.838494 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.838503 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp454\" (UniqueName: \"kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.838512 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.074437 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.074968 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.074979 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" containerName="mariadb-database-create" 
Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.074991 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57032abe-6c4f-4711-9f48-5733d6a29ec3" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.074998 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="57032abe-6c4f-4711-9f48-5733d6a29ec3" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.075011 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7dd20b-387a-4061-ab5a-a53ee6a240ef" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075018 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7dd20b-387a-4061-ab5a-a53ee6a240ef" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.075026 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="915cd482-d3dc-42c1-96cc-0fcc18bbaff2" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075032 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="915cd482-d3dc-42c1-96cc-0fcc18bbaff2" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.075041 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618b5189-8b29-473f-b59c-e911fca71041" containerName="glance-db-sync" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075046 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="618b5189-8b29-473f-b59c-e911fca71041" containerName="glance-db-sync" Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.075059 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d25daf-f279-4be1-be4a-75e05e47923c" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075065 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="85d25daf-f279-4be1-be4a-75e05e47923c" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.075073 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075079 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075211 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7dd20b-387a-4061-ab5a-a53ee6a240ef" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075224 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="618b5189-8b29-473f-b59c-e911fca71041" containerName="glance-db-sync" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075237 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075244 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075251 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d25daf-f279-4be1-be4a-75e05e47923c" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075258 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="57032abe-6c4f-4711-9f48-5733d6a29ec3" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075269 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="915cd482-d3dc-42c1-96cc-0fcc18bbaff2" containerName="mariadb-account-create-update" Mar 08 
19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.078032 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.092856 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.173861 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.173961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.173988 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6nhq\" (UniqueName: \"kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.174098 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: 
I0308 19:52:21.174127 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.275638 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.275913 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6nhq\" (UniqueName: \"kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.276016 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.276040 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.276080 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.277137 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.279625 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.279558 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.279950 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.292131 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6nhq\" (UniqueName: 
\"kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.493257 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.690510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zfp8t" event={"ID":"321f89cf-ed1f-4f10-a198-e55c23171363","Type":"ContainerStarted","Data":"1f2b4371c693a384eeb8722a9c031b14ca5b16214abf49aab649bdf051aaa6a9"} Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.706013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd"} Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.706262 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc"} Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.716486 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-zfp8t" podStartSLOduration=2.35271424 podStartE2EDuration="8.716474049s" podCreationTimestamp="2026-03-08 19:52:13 +0000 UTC" firstStartedPulling="2026-03-08 19:52:14.703507225 +0000 UTC m=+1236.099561248" lastFinishedPulling="2026-03-08 19:52:21.067267034 +0000 UTC m=+1242.463321057" observedRunningTime="2026-03-08 19:52:21.712663247 +0000 UTC m=+1243.108717270" watchObservedRunningTime="2026-03-08 19:52:21.716474049 +0000 UTC m=+1243.112528072" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.790964 
4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.051098 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.719953 4885 generic.go:334] "Generic (PLEG): container finished" podID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerID="5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf" exitCode=0 Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.720044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" event={"ID":"c6fd8def-cffc-4f64-9805-55040dae82c6","Type":"ContainerDied","Data":"5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf"} Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.720081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" event={"ID":"c6fd8def-cffc-4f64-9805-55040dae82c6","Type":"ContainerStarted","Data":"2371621d73898b5a7be293c5570047480578f0621eb75c65a2d656b40ff52a84"} Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.750056 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f"} Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.750116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81"} Mar 08 19:52:23 crc kubenswrapper[4885]: I0308 19:52:23.831014 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3"} Mar 08 19:52:23 crc kubenswrapper[4885]: I0308 19:52:23.831355 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd"} Mar 08 19:52:23 crc kubenswrapper[4885]: I0308 19:52:23.856826 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" event={"ID":"c6fd8def-cffc-4f64-9805-55040dae82c6","Type":"ContainerStarted","Data":"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef"} Mar 08 19:52:23 crc kubenswrapper[4885]: I0308 19:52:23.857816 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:23 crc kubenswrapper[4885]: I0308 19:52:23.895406 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" podStartSLOduration=2.895386572 podStartE2EDuration="2.895386572s" podCreationTimestamp="2026-03-08 19:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:23.881609533 +0000 UTC m=+1245.277663556" watchObservedRunningTime="2026-03-08 19:52:23.895386572 +0000 UTC m=+1245.291440595" Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.870192 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2"} Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.870465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb"} Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.870480 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca"} Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.870488 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474"} Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.870497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a"} Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.913132 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.372379833 podStartE2EDuration="45.913114598s" podCreationTimestamp="2026-03-08 19:51:39 +0000 UTC" firstStartedPulling="2026-03-08 19:52:12.789843201 +0000 UTC m=+1234.185897234" lastFinishedPulling="2026-03-08 19:52:23.330577976 +0000 UTC m=+1244.726631999" observedRunningTime="2026-03-08 19:52:24.906300356 +0000 UTC m=+1246.302354379" watchObservedRunningTime="2026-03-08 19:52:24.913114598 +0000 UTC m=+1246.309168611" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.177626 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.208096 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] 
Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.209765 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.219730 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.224047 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.356633 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.356814 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.356878 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cclxk\" (UniqueName: \"kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.357046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.357111 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.357158 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.458993 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459084 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459167 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459190 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cclxk\" (UniqueName: \"kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459210 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459226 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459905 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459955 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.460479 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.460969 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.461390 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.477693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cclxk\" (UniqueName: \"kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.537885 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.887477 4885 generic.go:334] "Generic (PLEG): container finished" podID="321f89cf-ed1f-4f10-a198-e55c23171363" containerID="1f2b4371c693a384eeb8722a9c031b14ca5b16214abf49aab649bdf051aaa6a9" exitCode=0 Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.887627 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zfp8t" event={"ID":"321f89cf-ed1f-4f10-a198-e55c23171363","Type":"ContainerDied","Data":"1f2b4371c693a384eeb8722a9c031b14ca5b16214abf49aab649bdf051aaa6a9"} Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.956723 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] Mar 08 19:52:26 crc kubenswrapper[4885]: I0308 19:52:26.900810 4885 generic.go:334] "Generic (PLEG): container finished" podID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerID="bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70" exitCode=0 Mar 08 19:52:26 crc kubenswrapper[4885]: I0308 19:52:26.900894 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" event={"ID":"b502bd3e-eafb-44cf-a81e-c10d647302a4","Type":"ContainerDied","Data":"bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70"} Mar 08 19:52:26 crc kubenswrapper[4885]: I0308 19:52:26.901590 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" event={"ID":"b502bd3e-eafb-44cf-a81e-c10d647302a4","Type":"ContainerStarted","Data":"a9372b012e69a1c2d0bad407661605d16c694bf0bbe0256a7ac15648fdbf0a98"} Mar 08 19:52:26 crc kubenswrapper[4885]: I0308 19:52:26.901742 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="dnsmasq-dns" 
containerID="cri-o://a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef" gracePeriod=10 Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.232836 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.312039 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389173 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data\") pod \"321f89cf-ed1f-4f10-a198-e55c23171363\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb\") pod \"c6fd8def-cffc-4f64-9805-55040dae82c6\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389362 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle\") pod \"321f89cf-ed1f-4f10-a198-e55c23171363\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389399 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6nhq\" (UniqueName: \"kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq\") pod \"c6fd8def-cffc-4f64-9805-55040dae82c6\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389438 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xvwxr\" (UniqueName: \"kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr\") pod \"321f89cf-ed1f-4f10-a198-e55c23171363\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389464 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc\") pod \"c6fd8def-cffc-4f64-9805-55040dae82c6\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389522 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb\") pod \"c6fd8def-cffc-4f64-9805-55040dae82c6\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389539 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config\") pod \"c6fd8def-cffc-4f64-9805-55040dae82c6\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.395078 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr" (OuterVolumeSpecName: "kube-api-access-xvwxr") pod "321f89cf-ed1f-4f10-a198-e55c23171363" (UID: "321f89cf-ed1f-4f10-a198-e55c23171363"). InnerVolumeSpecName "kube-api-access-xvwxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.395848 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq" (OuterVolumeSpecName: "kube-api-access-l6nhq") pod "c6fd8def-cffc-4f64-9805-55040dae82c6" (UID: "c6fd8def-cffc-4f64-9805-55040dae82c6"). InnerVolumeSpecName "kube-api-access-l6nhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.421929 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "321f89cf-ed1f-4f10-a198-e55c23171363" (UID: "321f89cf-ed1f-4f10-a198-e55c23171363"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.432622 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config" (OuterVolumeSpecName: "config") pod "c6fd8def-cffc-4f64-9805-55040dae82c6" (UID: "c6fd8def-cffc-4f64-9805-55040dae82c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.436319 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c6fd8def-cffc-4f64-9805-55040dae82c6" (UID: "c6fd8def-cffc-4f64-9805-55040dae82c6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.438400 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c6fd8def-cffc-4f64-9805-55040dae82c6" (UID: "c6fd8def-cffc-4f64-9805-55040dae82c6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.439015 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c6fd8def-cffc-4f64-9805-55040dae82c6" (UID: "c6fd8def-cffc-4f64-9805-55040dae82c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.445877 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data" (OuterVolumeSpecName: "config-data") pod "321f89cf-ed1f-4f10-a198-e55c23171363" (UID: "321f89cf-ed1f-4f10-a198-e55c23171363"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491467 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6nhq\" (UniqueName: \"kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491498 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvwxr\" (UniqueName: \"kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491511 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491522 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491530 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491540 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491549 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491557 4885 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.912089 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" event={"ID":"b502bd3e-eafb-44cf-a81e-c10d647302a4","Type":"ContainerStarted","Data":"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920"} Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.912319 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.913880 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.914162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zfp8t" event={"ID":"321f89cf-ed1f-4f10-a198-e55c23171363","Type":"ContainerDied","Data":"ba35ec27137480c28a7924c7547829e67cb3c39063decc95b516bf7fa2fd263c"} Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.914200 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba35ec27137480c28a7924c7547829e67cb3c39063decc95b516bf7fa2fd263c" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.915979 4885 generic.go:334] "Generic (PLEG): container finished" podID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerID="a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef" exitCode=0 Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.916020 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" event={"ID":"c6fd8def-cffc-4f64-9805-55040dae82c6","Type":"ContainerDied","Data":"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef"} Mar 08 19:52:27 crc 
kubenswrapper[4885]: I0308 19:52:27.916043 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" event={"ID":"c6fd8def-cffc-4f64-9805-55040dae82c6","Type":"ContainerDied","Data":"2371621d73898b5a7be293c5570047480578f0621eb75c65a2d656b40ff52a84"} Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.916063 4885 scope.go:117] "RemoveContainer" containerID="a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.916184 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.955430 4885 scope.go:117] "RemoveContainer" containerID="5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.956709 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" podStartSLOduration=2.956687492 podStartE2EDuration="2.956687492s" podCreationTimestamp="2026-03-08 19:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:27.94805823 +0000 UTC m=+1249.344112263" watchObservedRunningTime="2026-03-08 19:52:27.956687492 +0000 UTC m=+1249.352741525" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.976040 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.992381 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.992531 4885 scope.go:117] "RemoveContainer" containerID="a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef" Mar 08 19:52:27 crc kubenswrapper[4885]: E0308 19:52:27.993008 4885 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef\": container with ID starting with a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef not found: ID does not exist" containerID="a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.993040 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef"} err="failed to get container status \"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef\": rpc error: code = NotFound desc = could not find container \"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef\": container with ID starting with a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef not found: ID does not exist" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.993058 4885 scope.go:117] "RemoveContainer" containerID="5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf" Mar 08 19:52:27 crc kubenswrapper[4885]: E0308 19:52:27.993415 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf\": container with ID starting with 5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf not found: ID does not exist" containerID="5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.993449 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf"} err="failed to get container status \"5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf\": rpc error: code = NotFound desc = could 
not find container \"5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf\": container with ID starting with 5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf not found: ID does not exist" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.161635 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.234440 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:28 crc kubenswrapper[4885]: E0308 19:52:28.234769 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="init" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.234784 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="init" Mar 08 19:52:28 crc kubenswrapper[4885]: E0308 19:52:28.234813 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="dnsmasq-dns" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.234820 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="dnsmasq-dns" Mar 08 19:52:28 crc kubenswrapper[4885]: E0308 19:52:28.234830 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="321f89cf-ed1f-4f10-a198-e55c23171363" containerName="keystone-db-sync" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.234837 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="321f89cf-ed1f-4f10-a198-e55c23171363" containerName="keystone-db-sync" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.234994 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="321f89cf-ed1f-4f10-a198-e55c23171363" containerName="keystone-db-sync" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.235014 4885 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="dnsmasq-dns" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.235791 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.304067 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-n7295"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.306504 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.310638 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.311058 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.311103 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.311258 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nj2kw" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.311981 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.329792 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n7295"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.342269 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408726 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408776 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408859 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408900 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8l7x\" (UniqueName: \"kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408998 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409020 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409049 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409065 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409084 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409101 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qff\" (UniqueName: \"kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409125 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512731 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512772 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512792 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8l7x\" (UniqueName: \"kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512825 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512843 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512874 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512892 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512912 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qff\" (UniqueName: 
\"kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512978 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.513014 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.513035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.514028 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.514770 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.515808 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.516344 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.521473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.523572 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.531763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data\") pod \"keystone-bootstrap-n7295\" (UID: 
\"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.539366 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.543549 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.546954 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qff\" (UniqueName: \"kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.552073 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.555272 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.580454 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8l7x\" (UniqueName: \"kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.615334 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mp8c9"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.616337 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.628495 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.628812 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x9f48" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.628958 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.631246 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.714080 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mp8c9"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722148 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722265 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722307 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722414 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722510 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722664 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlqnv\" (UniqueName: \"kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.745597 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zz9c5"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.750805 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.754889 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.755074 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kg22q" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.755369 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824083 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824178 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824204 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlqnv\" (UniqueName: \"kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824227 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824246 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824267 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824288 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824314 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824350 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824368 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhz2b\" (UniqueName: \"kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.825303 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zz9c5"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.825473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.857375 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.858435 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.858523 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kpvl2"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.859749 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.860872 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.864811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.864946 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlqnv\" (UniqueName: \"kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.872936 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.874683 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.874894 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.883048 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kpvl2"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.886178 4885 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"neutron-neutron-dockercfg-xgk7z" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.928946 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhz2b\" (UniqueName: \"kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.928993 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929015 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg4r7\" (UniqueName: \"kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929053 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" 
Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929146 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929165 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929211 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.933317 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.934406 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.934776 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.938513 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.944534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.945144 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.960772 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhz2b\" (UniqueName: \"kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.962293 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-fwjsc"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.963494 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.997521 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.997805 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zrht4" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.000286 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.007034 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fwjsc"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.014414 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.016652 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.021615 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.021735 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.022070 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030260 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030302 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030351 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m92h4\" (UniqueName: \"kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030439 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030477 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6f64\" (UniqueName: \"kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc 
kubenswrapper[4885]: I0308 19:52:29.030510 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030541 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030570 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030587 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg4r7\" (UniqueName: \"kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030641 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.037774 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.039709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.055660 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg4r7\" (UniqueName: \"kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.107609 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.131588 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.132395 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6f64\" (UniqueName: \"kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.132490 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.132801 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.132873 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.132943 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: 
I0308 19:52:29.133000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133071 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133138 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133158 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133172 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133213 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133277 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m92h4\" (UniqueName: \"kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133325 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133338 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133353 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bkw5\" (UniqueName: \"kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133394 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133704 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.134212 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.134398 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.134986 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: 
\"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.139848 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.152181 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6f64\" (UniqueName: \"kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.152531 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.165619 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m92h4\" (UniqueName: \"kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.193693 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bkw5\" (UniqueName: \"kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235285 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235367 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235396 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235426 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235439 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235457 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.236312 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.236714 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.244956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.245062 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.245442 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.248223 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.257166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bkw5\" (UniqueName: \"kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.268317 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.270577 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.295845 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.332711 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.382133 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" path="/var/lib/kubelet/pods/c6fd8def-cffc-4f64-9805-55040dae82c6/volumes" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.437573 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n7295"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.578959 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.580509 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.586146 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.586317 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.586605 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zffrj" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.586800 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.601064 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.672035 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.674050 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.677346 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.677490 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757207 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljzt7\" (UniqueName: \"kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757327 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757355 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757480 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757507 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757544 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757565 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.771856 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.814853 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-db-sync-mp8c9"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859383 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859446 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859473 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859550 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljzt7\" (UniqueName: \"kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859593 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " 
pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859618 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859675 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq4fc\" (UniqueName: \"kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859700 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859715 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " 
pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859826 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859871 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859912 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859954 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 
08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859971 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.860374 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.860414 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.860607 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.865631 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.866760 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.875363 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.879529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.892792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljzt7\" (UniqueName: \"kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.915279 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.942587 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zz9c5"] Mar 08 19:52:29 crc 
kubenswrapper[4885]: I0308 19:52:29.961650 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961720 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961738 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961779 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961853 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961878 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq4fc\" (UniqueName: \"kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.963395 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.963465 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.963531 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.970534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.975146 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.977379 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.980580 4885 generic.go:334] "Generic (PLEG): container finished" podID="991f909f-207b-4663-adea-a4f8cd0c1cb6" containerID="24dcdf1aa7d0ce84609119c93df1bffbdbadae5f02169074317cbae4aeab74f6" exitCode=0 Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.980867 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" event={"ID":"991f909f-207b-4663-adea-a4f8cd0c1cb6","Type":"ContainerDied","Data":"24dcdf1aa7d0ce84609119c93df1bffbdbadae5f02169074317cbae4aeab74f6"} Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.980896 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" event={"ID":"991f909f-207b-4663-adea-a4f8cd0c1cb6","Type":"ContainerStarted","Data":"cf44c30d4d8bfe81037b523dba452fde19d3e3d6ea7ee6356c19f638cd1925fd"} Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.987829 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.991422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.991420 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zz9c5" 
event={"ID":"3dcb0d23-1927-4f70-ac45-bcc01f9a081a","Type":"ContainerStarted","Data":"25f4ab49981792217a76b7fdf7f3c26a7cebd2e01dffa1cebd3dfdfec45e537a"} Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.995444 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq4fc\" (UniqueName: \"kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.000207 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7295" event={"ID":"6a7fcdff-b452-4cc7-becc-65a43827b50b","Type":"ContainerStarted","Data":"74b67198f27ef75d7d190ca202d1aaac73c4511ab6579ab6cc6cd813c7ff04f7"} Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.000255 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7295" event={"ID":"6a7fcdff-b452-4cc7-becc-65a43827b50b","Type":"ContainerStarted","Data":"9cbc3dacc4f542eaf92c4ec2cfcb1c4cb3f323ffff727ffe3e01a0953b94c96a"} Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.002083 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.013175 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerName="dnsmasq-dns" containerID="cri-o://efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920" gracePeriod=10 Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.013287 4885 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-db-sync-mp8c9" event={"ID":"2efc22fd-a92b-422c-876d-7b80f06928b2","Type":"ContainerStarted","Data":"d6ed11f43445c4cfcc1c38c863e1b91670c997a17a500e0e306eca10269bbe60"} Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.030630 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-n7295" podStartSLOduration=2.030612484 podStartE2EDuration="2.030612484s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:30.028442376 +0000 UTC m=+1251.424496399" watchObservedRunningTime="2026-03-08 19:52:30.030612484 +0000 UTC m=+1251.426666507" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.085747 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kpvl2"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.126658 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.189866 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fwjsc"] Mar 08 19:52:30 crc kubenswrapper[4885]: W0308 19:52:30.204537 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46290bd2_6ad7_46f4_86f4_48aa73bc304a.slice/crio-e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268 WatchSource:0}: Error finding container e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268: Status 404 returned error can't find the container with id e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268 Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.338634 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.368972 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.505074 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.539913 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.582266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.582612 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.582642 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.582723 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.584444 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5qff\" (UniqueName: \"kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: 
\"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.584569 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.591809 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff" (OuterVolumeSpecName: "kube-api-access-m5qff") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "kube-api-access-m5qff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.631164 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.640209 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.690080 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.690114 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5qff\" (UniqueName: \"kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.698485 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.706457 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.742014 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.749442 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config" (OuterVolumeSpecName: "config") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.771325 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.793480 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.793503 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.793543 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.793556 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.809260 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:30 crc kubenswrapper[4885]: W0308 19:52:30.846311 4885 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod138479fb_965c_4e04_bd95_c7a683a5b0be.slice/crio-7bb80c6ac30c275247f5237a85b2e40b7de54586f2af99cfba7b3c794012d9a1 WatchSource:0}: Error finding container 7bb80c6ac30c275247f5237a85b2e40b7de54586f2af99cfba7b3c794012d9a1: Status 404 returned error can't find the container with id 7bb80c6ac30c275247f5237a85b2e40b7de54586f2af99cfba7b3c794012d9a1 Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.920950 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.031267 4885 generic.go:334] "Generic (PLEG): container finished" podID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerID="1640371a030e2f7def6f8a64bd3eab03cae3c318d2de3e683f5f0df21e52c94a" exitCode=0 Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.031330 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" event={"ID":"4e1d0a39-d199-43d0-bdea-33c0ecfff06f","Type":"ContainerDied","Data":"1640371a030e2f7def6f8a64bd3eab03cae3c318d2de3e683f5f0df21e52c94a"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.031355 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" event={"ID":"4e1d0a39-d199-43d0-bdea-33c0ecfff06f","Type":"ContainerStarted","Data":"3afeb0eb326591c1d8125d846debd099f1fede59cbbe242947add0ce27db5e6c"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.042166 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerStarted","Data":"7df05fec0614fac93aab8ae29777fc59abe79ffd7aca5937f60cd7d723e3ec63"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.053770 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 
19:52:31.082361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fwjsc" event={"ID":"46290bd2-6ad7-46f4-86f4-48aa73bc304a","Type":"ContainerStarted","Data":"e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.097306 4885 generic.go:334] "Generic (PLEG): container finished" podID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerID="efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920" exitCode=0 Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.097358 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" event={"ID":"b502bd3e-eafb-44cf-a81e-c10d647302a4","Type":"ContainerDied","Data":"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.097383 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" event={"ID":"b502bd3e-eafb-44cf-a81e-c10d647302a4","Type":"ContainerDied","Data":"a9372b012e69a1c2d0bad407661605d16c694bf0bbe0256a7ac15648fdbf0a98"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.097400 4885 scope.go:117] "RemoveContainer" containerID="efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.097513 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099027 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099090 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099122 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099152 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099198 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099274 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cclxk\" 
(UniqueName: \"kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.107621 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" event={"ID":"991f909f-207b-4663-adea-a4f8cd0c1cb6","Type":"ContainerDied","Data":"cf44c30d4d8bfe81037b523dba452fde19d3e3d6ea7ee6356c19f638cd1925fd"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.107757 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.110675 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk" (OuterVolumeSpecName: "kube-api-access-cclxk") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). InnerVolumeSpecName "kube-api-access-cclxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.121070 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kpvl2" event={"ID":"b04611e7-17b5-48ae-8169-534f684a101b","Type":"ContainerStarted","Data":"f13c64b3a8cac3c8bcb02e4b62a77799ac33e44598003cd3a842dd2a34fd0963"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.121116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kpvl2" event={"ID":"b04611e7-17b5-48ae-8169-534f684a101b","Type":"ContainerStarted","Data":"a46189fa57c904c31a1baeb88b028a78fe837ba28e4a7c2248f80625bc82f539"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.125717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerStarted","Data":"7bb80c6ac30c275247f5237a85b2e40b7de54586f2af99cfba7b3c794012d9a1"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.151623 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.166818 4885 scope.go:117] "RemoveContainer" containerID="bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.202137 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kpvl2" podStartSLOduration=3.18691909 podStartE2EDuration="3.18691909s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:31.14024914 +0000 UTC m=+1252.536303173" watchObservedRunningTime="2026-03-08 19:52:31.18691909 +0000 UTC m=+1252.582973113" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.206709 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cclxk\" (UniqueName: \"kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.206752 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.236047 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.248458 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.283619 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config" (OuterVolumeSpecName: "config") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.309034 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.309414 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.309442 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.309454 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.319042 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.329529 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.384664 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="991f909f-207b-4663-adea-a4f8cd0c1cb6" path="/var/lib/kubelet/pods/991f909f-207b-4663-adea-a4f8cd0c1cb6/volumes" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.423244 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.446134 4885 scope.go:117] "RemoveContainer" containerID="efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920" Mar 08 19:52:31 crc kubenswrapper[4885]: E0308 19:52:31.495677 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920\": container with ID starting with efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920 not found: ID does not exist" containerID="efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.495736 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920"} err="failed to get container status \"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920\": rpc error: code = NotFound desc = could not 
find container \"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920\": container with ID starting with efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920 not found: ID does not exist" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.495763 4885 scope.go:117] "RemoveContainer" containerID="bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70" Mar 08 19:52:31 crc kubenswrapper[4885]: E0308 19:52:31.505228 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70\": container with ID starting with bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70 not found: ID does not exist" containerID="bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.505506 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70"} err="failed to get container status \"bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70\": rpc error: code = NotFound desc = could not find container \"bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70\": container with ID starting with bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70 not found: ID does not exist" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.505927 4885 scope.go:117] "RemoveContainer" containerID="24dcdf1aa7d0ce84609119c93df1bffbdbadae5f02169074317cbae4aeab74f6" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.816595 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.826282 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] Mar 08 19:52:31 crc kubenswrapper[4885]: 
E0308 19:52:31.888746 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb502bd3e_eafb_44cf_a81e_c10d647302a4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb502bd3e_eafb_44cf_a81e_c10d647302a4.slice/crio-a9372b012e69a1c2d0bad407661605d16c694bf0bbe0256a7ac15648fdbf0a98\": RecentStats: unable to find data in memory cache]" Mar 08 19:52:32 crc kubenswrapper[4885]: I0308 19:52:32.163408 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" event={"ID":"4e1d0a39-d199-43d0-bdea-33c0ecfff06f","Type":"ContainerStarted","Data":"7cc6d42eb38cb1c77825df2d6223ceee7a5e25787b09d1e978bf98a5ce1ee38e"} Mar 08 19:52:32 crc kubenswrapper[4885]: I0308 19:52:32.163792 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:32 crc kubenswrapper[4885]: I0308 19:52:32.171180 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerStarted","Data":"34d894a56c80dc928bc7836837d51c9e6fa877a5201b35158a80f5ce7bf422e1"} Mar 08 19:52:32 crc kubenswrapper[4885]: I0308 19:52:32.206674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerStarted","Data":"911fa3cdbcf965e7428fcda4a5efd3eb7bd888422b953698cab04537d5e51708"} Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.220643 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerStarted","Data":"523c04f0937be42a22fbca9b82de419e54c01c986998bce3144f8515d06aaa53"} Mar 08 19:52:33 crc 
kubenswrapper[4885]: I0308 19:52:33.220753 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-log" containerID="cri-o://911fa3cdbcf965e7428fcda4a5efd3eb7bd888422b953698cab04537d5e51708" gracePeriod=30
Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.221019 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-httpd" containerID="cri-o://523c04f0937be42a22fbca9b82de419e54c01c986998bce3144f8515d06aaa53" gracePeriod=30
Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.235573 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerStarted","Data":"f1e60f93f4471d460063b494acaa641e15b5d2ebdd2145e718bfce9a4a9245aa"}
Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.247576 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" podStartSLOduration=5.247553467 podStartE2EDuration="5.247553467s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:32.202348005 +0000 UTC m=+1253.598402028" watchObservedRunningTime="2026-03-08 19:52:33.247553467 +0000 UTC m=+1254.643607490"
Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.255253 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.255235133 podStartE2EDuration="5.255235133s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:33.245842331 +0000 UTC m=+1254.641896354" watchObservedRunningTime="2026-03-08 19:52:33.255235133 +0000 UTC m=+1254.651289156"
Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.389667 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" path="/var/lib/kubelet/pods/b502bd3e-eafb-44cf-a81e-c10d647302a4/volumes"
Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.256153 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerStarted","Data":"25f0208e3538d6118cfee075db8e7f7b7dec77ce4a76f77fd3385ea0e30208a9"}
Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.256278 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-log" containerID="cri-o://f1e60f93f4471d460063b494acaa641e15b5d2ebdd2145e718bfce9a4a9245aa" gracePeriod=30
Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.256608 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-httpd" containerID="cri-o://25f0208e3538d6118cfee075db8e7f7b7dec77ce4a76f77fd3385ea0e30208a9" gracePeriod=30
Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.262401 4885 generic.go:334] "Generic (PLEG): container finished" podID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerID="523c04f0937be42a22fbca9b82de419e54c01c986998bce3144f8515d06aaa53" exitCode=0
Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.262432 4885 generic.go:334] "Generic (PLEG): container finished" podID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerID="911fa3cdbcf965e7428fcda4a5efd3eb7bd888422b953698cab04537d5e51708" exitCode=143
Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.262453 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerDied","Data":"523c04f0937be42a22fbca9b82de419e54c01c986998bce3144f8515d06aaa53"}
Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.262478 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerDied","Data":"911fa3cdbcf965e7428fcda4a5efd3eb7bd888422b953698cab04537d5e51708"}
Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.278532 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.278513138 podStartE2EDuration="6.278513138s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:34.273867173 +0000 UTC m=+1255.669921196" watchObservedRunningTime="2026-03-08 19:52:34.278513138 +0000 UTC m=+1255.674567161"
Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.289744 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a7fcdff-b452-4cc7-becc-65a43827b50b" containerID="74b67198f27ef75d7d190ca202d1aaac73c4511ab6579ab6cc6cd813c7ff04f7" exitCode=0
Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.289838 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7295" event={"ID":"6a7fcdff-b452-4cc7-becc-65a43827b50b","Type":"ContainerDied","Data":"74b67198f27ef75d7d190ca202d1aaac73c4511ab6579ab6cc6cd813c7ff04f7"}
Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.295193 4885 generic.go:334] "Generic (PLEG): container finished" podID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerID="25f0208e3538d6118cfee075db8e7f7b7dec77ce4a76f77fd3385ea0e30208a9" exitCode=0
Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.295263 4885 generic.go:334] "Generic (PLEG): container finished" podID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerID="f1e60f93f4471d460063b494acaa641e15b5d2ebdd2145e718bfce9a4a9245aa" exitCode=143
Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.295322 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerDied","Data":"25f0208e3538d6118cfee075db8e7f7b7dec77ce4a76f77fd3385ea0e30208a9"}
Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.295399 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerDied","Data":"f1e60f93f4471d460063b494acaa641e15b5d2ebdd2145e718bfce9a4a9245aa"}
Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.968115 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.053533 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") "
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.053600 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") "
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.053639 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") "
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.053802 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljzt7\" (UniqueName: \"kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") "
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.054259 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") "
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.054375 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") "
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.054405 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") "
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.054432 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") "
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.054093 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.055152 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs" (OuterVolumeSpecName: "logs") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.055557 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.055574 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.060387 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.060529 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7" (OuterVolumeSpecName: "kube-api-access-ljzt7") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "kube-api-access-ljzt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.060583 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts" (OuterVolumeSpecName: "scripts") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.087094 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.109958 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.111573 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data" (OuterVolumeSpecName: "config-data") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.157943 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.157978 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.157988 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljzt7\" (UniqueName: \"kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.158035 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.158062 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.158073 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.178960 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.260953 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.309818 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.315231 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerDied","Data":"7bb80c6ac30c275247f5237a85b2e40b7de54586f2af99cfba7b3c794012d9a1"}
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.315283 4885 scope.go:117] "RemoveContainer" containerID="523c04f0937be42a22fbca9b82de419e54c01c986998bce3144f8515d06aaa53"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.362267 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.372844 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379415 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 08 19:52:36 crc kubenswrapper[4885]: E0308 19:52:36.379807 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-httpd"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379820 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-httpd"
Mar 08 19:52:36 crc kubenswrapper[4885]: E0308 19:52:36.379843 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991f909f-207b-4663-adea-a4f8cd0c1cb6" containerName="init"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379849 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="991f909f-207b-4663-adea-a4f8cd0c1cb6" containerName="init"
Mar 08 19:52:36 crc kubenswrapper[4885]: E0308 19:52:36.379862 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerName="dnsmasq-dns"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379869 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerName="dnsmasq-dns"
Mar 08 19:52:36 crc kubenswrapper[4885]: E0308 19:52:36.379888 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerName="init"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379912 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerName="init"
Mar 08 19:52:36 crc kubenswrapper[4885]: E0308 19:52:36.379934 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-log"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379940 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-log"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.380099 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-log"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.380115 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="991f909f-207b-4663-adea-a4f8cd0c1cb6" containerName="init"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.380131 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerName="dnsmasq-dns"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.380139 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-httpd"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.381431 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.383440 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.386163 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.400109 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.578577 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586047 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586156 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586235 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586278 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586380 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586492 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzlbm\" (UniqueName: \"kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586539 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688377 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzlbm\" (UniqueName: \"kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688616 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688699 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688730 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688758 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688776 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.689024 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.689095 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.689742 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.697721 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.697799 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.707806 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.708379 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.717997 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzlbm\" (UniqueName: \"kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.788833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:52:37 crc kubenswrapper[4885]: I0308 19:52:37.005845 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 08 19:52:37 crc kubenswrapper[4885]: I0308 19:52:37.389717 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" path="/var/lib/kubelet/pods/138479fb-965c-4e04-bd95-c7a683a5b0be/volumes"
Mar 08 19:52:39 crc kubenswrapper[4885]: I0308 19:52:39.270133 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2"
Mar 08 19:52:39 crc kubenswrapper[4885]: I0308 19:52:39.329453 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"]
Mar 08 19:52:39 crc kubenswrapper[4885]: I0308 19:52:39.329726 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" containerID="cri-o://701f88d27bc0253f9783e551e39a9c634999051422220378d318c698c63ae0f5" gracePeriod=10
Mar 08 19:52:39 crc kubenswrapper[4885]: I0308 19:52:39.535681 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused"
Mar 08 19:52:40 crc kubenswrapper[4885]: I0308 19:52:40.370039 4885 generic.go:334] "Generic (PLEG): container finished" podID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerID="701f88d27bc0253f9783e551e39a9c634999051422220378d318c698c63ae0f5" exitCode=0
Mar 08 19:52:40 crc kubenswrapper[4885]: I0308 19:52:40.370281 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" event={"ID":"67aa348d-fe05-4e05-af01-a0b22d170a9b","Type":"ContainerDied","Data":"701f88d27bc0253f9783e551e39a9c634999051422220378d318c698c63ae0f5"}
Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.855661 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n7295"
Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.864285 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.999431 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") "
Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.999503 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") "
Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.999530 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") "
Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.999572 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") "
Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.999936 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs" (OuterVolumeSpecName: "logs") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000328 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") "
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000368 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") "
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000389 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") "
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000686 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") "
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000697 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000726 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq4fc\" (UniqueName: \"kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") "
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000758 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") "
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000813 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") "
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000855 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") "
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000951 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") "
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000989 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8l7x\" (UniqueName: \"kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") "
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.005483 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts" (OuterVolumeSpecName: "scripts") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.006372 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.006390 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.006401 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.009120 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.009320 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts" (OuterVolumeSpecName: "scripts") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.009393 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.010855 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.011287 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x" (OuterVolumeSpecName: "kube-api-access-c8l7x") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). InnerVolumeSpecName "kube-api-access-c8l7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.015388 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc" (OuterVolumeSpecName: "kube-api-access-kq4fc") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "kube-api-access-kq4fc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.030231 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.032214 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.032838 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data" (OuterVolumeSpecName: "config-data") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.063656 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data" (OuterVolumeSpecName: "config-data") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.080484 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.107872 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.107910 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8l7x\" (UniqueName: \"kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.107968 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.107982 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.107995 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108006 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108018 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108030 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108040 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq4fc\" (UniqueName: \"kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108050 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108061 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.128157 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.209114 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.400965 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7295" event={"ID":"6a7fcdff-b452-4cc7-becc-65a43827b50b","Type":"ContainerDied","Data":"9cbc3dacc4f542eaf92c4ec2cfcb1c4cb3f323ffff727ffe3e01a0953b94c96a"}
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.401013 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cbc3dacc4f542eaf92c4ec2cfcb1c4cb3f323ffff727ffe3e01a0953b94c96a" Mar 08
19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.401046 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n7295"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.404285 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerDied","Data":"34d894a56c80dc928bc7836837d51c9e6fa877a5201b35158a80f5ce7bf422e1"}
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.404467 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.454997 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.469193 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.506215 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 08 19:52:42 crc kubenswrapper[4885]: E0308 19:52:42.512303 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a7fcdff-b452-4cc7-becc-65a43827b50b" containerName="keystone-bootstrap"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512521 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a7fcdff-b452-4cc7-becc-65a43827b50b" containerName="keystone-bootstrap"
Mar 08 19:52:42 crc kubenswrapper[4885]: E0308 19:52:42.512599 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-log"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512605 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-log"
Mar 08 19:52:42 crc kubenswrapper[4885]: E0308 19:52:42.512619 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-httpd"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512627 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-httpd"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512937 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-log"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512950 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-httpd"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512966 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a7fcdff-b452-4cc7-becc-65a43827b50b" containerName="keystone-bootstrap"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.513984 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.516004 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.517479 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.522912 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615461 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkzrv\" (UniqueName: \"kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615517 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615549 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615863 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.616005 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.616127 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.717826 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.717896 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.717950 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.717997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.718039 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.718089 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkzrv\" (UniqueName: \"kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.718134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.718175 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.718721 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.719495 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.719609 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.722825 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.722862 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.723232 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.728736 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.735945 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkzrv\" (UniqueName: \"kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.749821 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.835477 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.942561 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-n7295"]
Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.949951 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-n7295"]
Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.027993 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-j97wh"]
Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.029271 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.031692 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nj2kw" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.031743 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.031890 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.032343 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.034375 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.038622 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j97wh"] Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.125631 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.125725 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.125874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.125982 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2r5x\" (UniqueName: \"kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.126056 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.126117 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.228813 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.228894 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.230842 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.231044 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2r5x\" (UniqueName: \"kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.231107 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.232317 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.235519 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys\") pod 
\"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.235849 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.235854 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.236082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.237050 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.254202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2r5x\" (UniqueName: \"kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc 
kubenswrapper[4885]: I0308 19:52:43.352614 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.379242 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" path="/var/lib/kubelet/pods/68f1c367-d6c5-4e77-a051-f73fddfa1085/volumes" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.380109 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a7fcdff-b452-4cc7-becc-65a43827b50b" path="/var/lib/kubelet/pods/6a7fcdff-b452-4cc7-becc-65a43827b50b/volumes" Mar 08 19:52:44 crc kubenswrapper[4885]: I0308 19:52:44.536005 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 08 19:52:47 crc kubenswrapper[4885]: I0308 19:52:47.458210 4885 generic.go:334] "Generic (PLEG): container finished" podID="b04611e7-17b5-48ae-8169-534f684a101b" containerID="f13c64b3a8cac3c8bcb02e4b62a77799ac33e44598003cd3a842dd2a34fd0963" exitCode=0 Mar 08 19:52:47 crc kubenswrapper[4885]: I0308 19:52:47.458528 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kpvl2" event={"ID":"b04611e7-17b5-48ae-8169-534f684a101b","Type":"ContainerDied","Data":"f13c64b3a8cac3c8bcb02e4b62a77799ac33e44598003cd3a842dd2a34fd0963"} Mar 08 19:52:49 crc kubenswrapper[4885]: E0308 19:52:49.724905 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Mar 08 19:52:49 crc kubenswrapper[4885]: E0308 19:52:49.725500 4885 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w6f64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-fwjsc_openstack(46290bd2-6ad7-46f4-86f4-48aa73bc304a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:52:49 crc kubenswrapper[4885]: E0308 19:52:49.726772 4885 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-fwjsc" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.747792 4885 scope.go:117] "RemoveContainer" containerID="911fa3cdbcf965e7428fcda4a5efd3eb7bd888422b953698cab04537d5e51708" Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.840769 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.973866 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg4r7\" (UniqueName: \"kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7\") pod \"b04611e7-17b5-48ae-8169-534f684a101b\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.973901 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle\") pod \"b04611e7-17b5-48ae-8169-534f684a101b\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.974050 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config\") pod \"b04611e7-17b5-48ae-8169-534f684a101b\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.991814 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7" (OuterVolumeSpecName: "kube-api-access-dg4r7") pod "b04611e7-17b5-48ae-8169-534f684a101b" 
(UID: "b04611e7-17b5-48ae-8169-534f684a101b"). InnerVolumeSpecName "kube-api-access-dg4r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.995077 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b04611e7-17b5-48ae-8169-534f684a101b" (UID: "b04611e7-17b5-48ae-8169-534f684a101b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.015895 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config" (OuterVolumeSpecName: "config") pod "b04611e7-17b5-48ae-8169-534f684a101b" (UID: "b04611e7-17b5-48ae-8169-534f684a101b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.077294 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.077350 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg4r7\" (UniqueName: \"kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.077370 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.510681 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kpvl2" 
event={"ID":"b04611e7-17b5-48ae-8169-534f684a101b","Type":"ContainerDied","Data":"a46189fa57c904c31a1baeb88b028a78fe837ba28e4a7c2248f80625bc82f539"} Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.510731 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a46189fa57c904c31a1baeb88b028a78fe837ba28e4a7c2248f80625bc82f539" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.510807 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:50 crc kubenswrapper[4885]: E0308 19:52:50.517087 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-fwjsc" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.103131 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:52:51 crc kubenswrapper[4885]: E0308 19:52:51.103781 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04611e7-17b5-48ae-8169-534f684a101b" containerName="neutron-db-sync" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.103793 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04611e7-17b5-48ae-8169-534f684a101b" containerName="neutron-db-sync" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.103996 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04611e7-17b5-48ae-8169-534f684a101b" containerName="neutron-db-sync" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.104896 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.119634 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.197768 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.197833 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.197861 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.197893 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.197951 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.198220 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkbcw\" (UniqueName: \"kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.198842 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.200390 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.205098 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xgk7z" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.205262 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.205318 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.205490 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.227532 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.299812 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.299926 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.299962 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkbcw\" (UniqueName: \"kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300142 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300374 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfv9t\" (UniqueName: \"kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300535 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300572 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300588 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300626 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300665 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300700 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300983 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.301499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.301653 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.301789 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.302186 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.321841 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkbcw\" (UniqueName: \"kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.402499 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.402566 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.402674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.402716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle\") pod 
\"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.402770 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfv9t\" (UniqueName: \"kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.406131 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.406411 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.406986 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.416212 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 
19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.431733 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfv9t\" (UniqueName: \"kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.433678 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.522101 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: E0308 19:52:51.852929 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 08 19:52:51 crc kubenswrapper[4885]: E0308 19:52:51.853407 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wlqnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-mp8c9_openstack(2efc22fd-a92b-422c-876d-7b80f06928b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:52:51 crc kubenswrapper[4885]: E0308 19:52:51.854703 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-mp8c9" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.884074 4885 scope.go:117] "RemoveContainer" containerID="25f0208e3538d6118cfee075db8e7f7b7dec77ce4a76f77fd3385ea0e30208a9" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.926592 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.955192 4885 scope.go:117] "RemoveContainer" containerID="f1e60f93f4471d460063b494acaa641e15b5d2ebdd2145e718bfce9a4a9245aa" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.118154 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.118200 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.118230 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq487\" (UniqueName: \"kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.118342 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.118366 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: 
\"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.138020 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487" (OuterVolumeSpecName: "kube-api-access-jq487") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). InnerVolumeSpecName "kube-api-access-jq487". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.215799 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.218667 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.223874 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: W0308 19:52:52.224047 4885 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/67aa348d-fe05-4e05-af01-a0b22d170a9b/volumes/kubernetes.io~configmap/ovsdbserver-nb Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.224070 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.224355 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq487\" (UniqueName: \"kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.224387 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.224401 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.235839 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.243132 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config" (OuterVolumeSpecName: "config") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.325976 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.326280 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:52 crc kubenswrapper[4885]: W0308 19:52:52.456359 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43dd77c8_6951_423a_9334_502f66c3d1b5.slice/crio-147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b WatchSource:0}: Error finding container 147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b: Status 404 returned error can't find the container with id 147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.461363 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.468270 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.476002 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j97wh"] Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.555877 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerStarted","Data":"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f"} Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.566025 4885 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/placement-db-sync-zz9c5" event={"ID":"3dcb0d23-1927-4f70-ac45-bcc01f9a081a","Type":"ContainerStarted","Data":"302d122e7028362942b84bde8688589c00dd224f41e987890dd32bc866af958e"} Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.572763 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" event={"ID":"67aa348d-fe05-4e05-af01-a0b22d170a9b","Type":"ContainerDied","Data":"e012096eca7d05b70c67ac3ab0c256b21f51521655fd61aa44ededf7f87ad72c"} Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.572835 4885 scope.go:117] "RemoveContainer" containerID="701f88d27bc0253f9783e551e39a9c634999051422220378d318c698c63ae0f5" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.572787 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.582530 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerStarted","Data":"4a7d8fc5b878c3d0d7e4f116b765cb88438382af5862067e0b9a50b02bb40fea"} Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.586262 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j97wh" event={"ID":"43dd77c8-6951-423a-9334-502f66c3d1b5","Type":"ContainerStarted","Data":"147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b"} Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.614963 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:52 crc kubenswrapper[4885]: E0308 19:52:52.620699 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-mp8c9" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.625573 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zz9c5" podStartSLOduration=4.837675388 podStartE2EDuration="24.625551328s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="2026-03-08 19:52:29.962998825 +0000 UTC m=+1251.359052858" lastFinishedPulling="2026-03-08 19:52:49.750874775 +0000 UTC m=+1271.146928798" observedRunningTime="2026-03-08 19:52:52.582650211 +0000 UTC m=+1273.978704234" watchObservedRunningTime="2026-03-08 19:52:52.625551328 +0000 UTC m=+1274.021605351" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.647758 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"] Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.648182 4885 scope.go:117] "RemoveContainer" containerID="b8826682ae559379d101397fb94513059f3cfbd38258fd8e20b0bbd2e14276d1" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.655401 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"] Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.662619 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.673735 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.384474 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" path="/var/lib/kubelet/pods/67aa348d-fe05-4e05-af01-a0b22d170a9b/volumes" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.598099 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j97wh" event={"ID":"43dd77c8-6951-423a-9334-502f66c3d1b5","Type":"ContainerStarted","Data":"6bdf4492dc9ff59a23eeb3289e91f60d8a1697795948d983180a4ac75c5e122e"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.600105 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerStarted","Data":"bf43cbd05abc4859f6ceeadb70f7e22ee780318d3caa3829c75db5c4ff63615b"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.600146 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerStarted","Data":"0ea29bb519254f2f6d232c0a073b9e8199e006c424f0d72c1ebe3ec8f1381dff"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.602125 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerStarted","Data":"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.602161 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerStarted","Data":"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.602172 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerStarted","Data":"ce2e04e0c937557f90cd3e8f47d07607bc0f9d0c6eb93b3ee180bc8da569c97b"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.602265 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:53 
crc kubenswrapper[4885]: I0308 19:52:53.603708 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerID="695a219704dcca2bf986cd5937fa221f84e3701eae327a3dbb20ee1df55cf8bc" exitCode=0 Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.603752 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" event={"ID":"3c32ee1c-ff69-4043-a31d-92be1d77a404","Type":"ContainerDied","Data":"695a219704dcca2bf986cd5937fa221f84e3701eae327a3dbb20ee1df55cf8bc"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.603769 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" event={"ID":"3c32ee1c-ff69-4043-a31d-92be1d77a404","Type":"ContainerStarted","Data":"d542f54b493f5cfa9eb7867bd9658f59584a89f76c6986f57f0a54179d70ccd3"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.608104 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerStarted","Data":"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.622006 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-j97wh" podStartSLOduration=10.621990936 podStartE2EDuration="10.621990936s" podCreationTimestamp="2026-03-08 19:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:53.615272426 +0000 UTC m=+1275.011326449" watchObservedRunningTime="2026-03-08 19:52:53.621990936 +0000 UTC m=+1275.018044959" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.639380 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bbc5d6644-tztss" podStartSLOduration=2.63936361 podStartE2EDuration="2.63936361s" 
podCreationTimestamp="2026-03-08 19:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:53.63409905 +0000 UTC m=+1275.030153063" watchObservedRunningTime="2026-03-08 19:52:53.63936361 +0000 UTC m=+1275.035417633" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.988902 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"] Mar 08 19:52:53 crc kubenswrapper[4885]: E0308 19:52:53.989596 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.989608 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" Mar 08 19:52:53 crc kubenswrapper[4885]: E0308 19:52:53.989629 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="init" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.989638 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="init" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.989797 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.990625 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.996255 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.996447 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.009986 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"] Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071715 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071769 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071808 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071835 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071859 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-676dz\" (UniqueName: \"kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173394 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173463 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173493 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173529 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173560 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173585 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-676dz\" (UniqueName: \"kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173612 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle\") 
pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.178467 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.178609 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.178978 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.179506 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.180642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc 
kubenswrapper[4885]: I0308 19:52:54.185254 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.192765 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-676dz\" (UniqueName: \"kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.371255 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.535873 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.628780 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerStarted","Data":"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2"} Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.637708 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerStarted","Data":"76f21ebe5e2a1ec7083c3221c48c316f2bf16958272fe800dd874d9d19aa99dd"} Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.640894 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerStarted","Data":"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2"} Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.644817 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" event={"ID":"3c32ee1c-ff69-4043-a31d-92be1d77a404","Type":"ContainerStarted","Data":"054cb1ab59c45cfe6a5f78ad6e0db5b7c13aecc33b053086164f7b3a9057b5c9"} Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.645619 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.648803 4885 generic.go:334] "Generic (PLEG): container finished" podID="3dcb0d23-1927-4f70-ac45-bcc01f9a081a" containerID="302d122e7028362942b84bde8688589c00dd224f41e987890dd32bc866af958e" exitCode=0 Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.648992 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zz9c5" event={"ID":"3dcb0d23-1927-4f70-ac45-bcc01f9a081a","Type":"ContainerDied","Data":"302d122e7028362942b84bde8688589c00dd224f41e987890dd32bc866af958e"} Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.662700 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=18.662669957 podStartE2EDuration="18.662669957s" podCreationTimestamp="2026-03-08 19:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:54.652798672 +0000 UTC m=+1276.048852695" watchObservedRunningTime="2026-03-08 19:52:54.662669957 +0000 UTC m=+1276.058724020" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.694605 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" 
podStartSLOduration=12.694587681 podStartE2EDuration="12.694587681s" podCreationTimestamp="2026-03-08 19:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:54.689264388 +0000 UTC m=+1276.085318411" watchObservedRunningTime="2026-03-08 19:52:54.694587681 +0000 UTC m=+1276.090641704" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.713120 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" podStartSLOduration=3.713099216 podStartE2EDuration="3.713099216s" podCreationTimestamp="2026-03-08 19:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:54.705563285 +0000 UTC m=+1276.101617308" watchObservedRunningTime="2026-03-08 19:52:54.713099216 +0000 UTC m=+1276.109153239" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.915521 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"] Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.452766 4885 scope.go:117] "RemoveContainer" containerID="dd28461ac62623fc6ad7ac5f483ad81428e5d2b1b26c821a328a6729559f6fbb" Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.668231 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerStarted","Data":"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c"} Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.670378 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerStarted","Data":"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304"} Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.670397 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerStarted","Data":"276c039d660964960e48e559165b8b647e2ab8ab57d7b59f8062379f583a0dc6"} Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.670696 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.707307 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56dd4b5ff7-j89qr" podStartSLOduration=2.707289623 podStartE2EDuration="2.707289623s" podCreationTimestamp="2026-03-08 19:52:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:55.697899522 +0000 UTC m=+1277.093953545" watchObservedRunningTime="2026-03-08 19:52:55.707289623 +0000 UTC m=+1277.103343646" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.090058 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.111623 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle\") pod \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.111728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts\") pod \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.111751 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data\") pod \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.111770 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs\") pod \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.111829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhz2b\" (UniqueName: \"kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b\") pod \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.113271 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs" (OuterVolumeSpecName: "logs") pod "3dcb0d23-1927-4f70-ac45-bcc01f9a081a" (UID: "3dcb0d23-1927-4f70-ac45-bcc01f9a081a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.120228 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b" (OuterVolumeSpecName: "kube-api-access-mhz2b") pod "3dcb0d23-1927-4f70-ac45-bcc01f9a081a" (UID: "3dcb0d23-1927-4f70-ac45-bcc01f9a081a"). InnerVolumeSpecName "kube-api-access-mhz2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.128374 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts" (OuterVolumeSpecName: "scripts") pod "3dcb0d23-1927-4f70-ac45-bcc01f9a081a" (UID: "3dcb0d23-1927-4f70-ac45-bcc01f9a081a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.144587 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data" (OuterVolumeSpecName: "config-data") pod "3dcb0d23-1927-4f70-ac45-bcc01f9a081a" (UID: "3dcb0d23-1927-4f70-ac45-bcc01f9a081a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.172933 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dcb0d23-1927-4f70-ac45-bcc01f9a081a" (UID: "3dcb0d23-1927-4f70-ac45-bcc01f9a081a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.213221 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhz2b\" (UniqueName: \"kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.213258 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.213274 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.213285 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.213299 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.681061 4885 generic.go:334] "Generic (PLEG): container finished" podID="43dd77c8-6951-423a-9334-502f66c3d1b5" containerID="6bdf4492dc9ff59a23eeb3289e91f60d8a1697795948d983180a4ac75c5e122e" exitCode=0 Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.682418 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j97wh" event={"ID":"43dd77c8-6951-423a-9334-502f66c3d1b5","Type":"ContainerDied","Data":"6bdf4492dc9ff59a23eeb3289e91f60d8a1697795948d983180a4ac75c5e122e"} Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.687963 4885 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.687981 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zz9c5" event={"ID":"3dcb0d23-1927-4f70-ac45-bcc01f9a081a","Type":"ContainerDied","Data":"25f4ab49981792217a76b7fdf7f3c26a7cebd2e01dffa1cebd3dfdfec45e537a"} Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.688233 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f4ab49981792217a76b7fdf7f3c26a7cebd2e01dffa1cebd3dfdfec45e537a" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.764121 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b5685698-p87pb"] Mar 08 19:52:56 crc kubenswrapper[4885]: E0308 19:52:56.764555 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcb0d23-1927-4f70-ac45-bcc01f9a081a" containerName="placement-db-sync" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.764579 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcb0d23-1927-4f70-ac45-bcc01f9a081a" containerName="placement-db-sync" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.764809 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dcb0d23-1927-4f70-ac45-bcc01f9a081a" containerName="placement-db-sync" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.766530 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.768160 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.768390 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.772150 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.772279 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kg22q" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.778007 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b5685698-p87pb"] Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.779669 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.834974 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835047 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835079 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835146 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq6bs\" (UniqueName: \"kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835220 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835277 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936726 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6bs\" (UniqueName: \"kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936815 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936861 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936949 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936975 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.937014 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.937948 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.941008 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.941707 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.943227 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts\") pod \"placement-6b5685698-p87pb\" (UID: 
\"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.943642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.955986 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq6bs\" (UniqueName: \"kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.956193 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.007105 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.007152 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.041059 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.082946 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 19:52:57 crc kubenswrapper[4885]: 
I0308 19:52:57.093945 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.602373 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b5685698-p87pb"] Mar 08 19:52:57 crc kubenswrapper[4885]: W0308 19:52:57.633245 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30962695_4bc8_4fd2_b6e4_5b4b1f9d75a1.slice/crio-728e2c4227d2c2ab14fe2c194baeb854bdfe4cac966691c25a22f562f3e2f82b WatchSource:0}: Error finding container 728e2c4227d2c2ab14fe2c194baeb854bdfe4cac966691c25a22f562f3e2f82b: Status 404 returned error can't find the container with id 728e2c4227d2c2ab14fe2c194baeb854bdfe4cac966691c25a22f562f3e2f82b Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.723983 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerStarted","Data":"728e2c4227d2c2ab14fe2c194baeb854bdfe4cac966691c25a22f562f3e2f82b"} Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.724218 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.724756 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 19:52:58 crc kubenswrapper[4885]: I0308 19:52:58.733131 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerStarted","Data":"284122f7c1790bf6573097e7966743c78c5e2016a00bf0c85b35b46fb79ec3ac"} Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.077406 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.249153 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.249220 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.249262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2r5x\" (UniqueName: \"kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.249446 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.249844 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.250372 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.255555 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.255663 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts" (OuterVolumeSpecName: "scripts") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.267054 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.274189 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x" (OuterVolumeSpecName: "kube-api-access-b2r5x") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "kube-api-access-b2r5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.291804 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data" (OuterVolumeSpecName: "config-data") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.305089 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352397 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352441 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352455 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2r5x\" (UniqueName: \"kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352467 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 
08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352478 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352489 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.518284 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.573410 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.753689 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j97wh" event={"ID":"43dd77c8-6951-423a-9334-502f66c3d1b5","Type":"ContainerDied","Data":"147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b"} Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.753730 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.753735 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.755853 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerStarted","Data":"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b"} Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.758227 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerStarted","Data":"4b8fc7ef37c75cd3a7a88b7cc2d7710779d5554badc6896eed06c03d3625b81a"} Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.784244 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b5685698-p87pb" podStartSLOduration=4.784211343 podStartE2EDuration="4.784211343s" podCreationTimestamp="2026-03-08 19:52:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:00.774292648 +0000 UTC m=+1282.170346671" watchObservedRunningTime="2026-03-08 19:53:00.784211343 +0000 UTC m=+1282.180265356" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.193690 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"] Mar 08 19:53:01 crc kubenswrapper[4885]: E0308 19:53:01.194042 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43dd77c8-6951-423a-9334-502f66c3d1b5" containerName="keystone-bootstrap" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.194055 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="43dd77c8-6951-423a-9334-502f66c3d1b5" containerName="keystone-bootstrap" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.194297 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="43dd77c8-6951-423a-9334-502f66c3d1b5" 
containerName="keystone-bootstrap" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.194852 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198014 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198182 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198439 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198556 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198667 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198878 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nj2kw" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.215292 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"] Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.370874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.370930 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q2mg\" (UniqueName: 
\"kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.370979 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.371000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.371030 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.371078 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.371116 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.371137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.435074 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.472896 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.472987 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473502 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473653 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q2mg\" (UniqueName: \"kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473740 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473767 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473831 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473858 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.489373 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.490265 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.491485 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.491679 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.492282 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.492824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts\") pod \"keystone-574d5c476f-sq4hm\" 
(UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.503511 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.503731 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="dnsmasq-dns" containerID="cri-o://7cc6d42eb38cb1c77825df2d6223ceee7a5e25787b09d1e978bf98a5ce1ee38e" gracePeriod=10 Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.508598 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q2mg\" (UniqueName: \"kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.530054 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.782857 4885 generic.go:334] "Generic (PLEG): container finished" podID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerID="7cc6d42eb38cb1c77825df2d6223ceee7a5e25787b09d1e978bf98a5ce1ee38e" exitCode=0 Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.783696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" event={"ID":"4e1d0a39-d199-43d0-bdea-33c0ecfff06f","Type":"ContainerDied","Data":"7cc6d42eb38cb1c77825df2d6223ceee7a5e25787b09d1e978bf98a5ce1ee38e"} Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 
19:53:01.783723 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.784064 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.810765 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.004562 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187054 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m92h4\" (UniqueName: \"kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4\") pod \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187180 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc\") pod \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187216 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb\") pod \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187295 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config\") pod 
\"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187317 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0\") pod \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187347 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb\") pod \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.199153 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4" (OuterVolumeSpecName: "kube-api-access-m92h4") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "kube-api-access-m92h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.227483 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.233580 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config" (OuterVolumeSpecName: "config") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.233673 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.241298 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.242388 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289566 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289596 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289606 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289615 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m92h4\" (UniqueName: \"kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289623 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289631 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.331523 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"] Mar 08 19:53:02 crc kubenswrapper[4885]: W0308 19:53:02.333167 4885 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a28c270_c9ef_4b8c_a8e7_bcc69a1419cc.slice/crio-bb3823d25975ffc500551905492324189bd2643724049b62ce6f52a7469d0c61 WatchSource:0}: Error finding container bb3823d25975ffc500551905492324189bd2643724049b62ce6f52a7469d0c61: Status 404 returned error can't find the container with id bb3823d25975ffc500551905492324189bd2643724049b62ce6f52a7469d0c61 Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.794509 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-574d5c476f-sq4hm" event={"ID":"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc","Type":"ContainerStarted","Data":"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421"} Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.794785 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-574d5c476f-sq4hm" event={"ID":"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc","Type":"ContainerStarted","Data":"bb3823d25975ffc500551905492324189bd2643724049b62ce6f52a7469d0c61"} Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.795384 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.804858 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.804902 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" event={"ID":"4e1d0a39-d199-43d0-bdea-33c0ecfff06f","Type":"ContainerDied","Data":"3afeb0eb326591c1d8125d846debd099f1fede59cbbe242947add0ce27db5e6c"} Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.804958 4885 scope.go:117] "RemoveContainer" containerID="7cc6d42eb38cb1c77825df2d6223ceee7a5e25787b09d1e978bf98a5ce1ee38e" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.817728 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.817781 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.822492 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-574d5c476f-sq4hm" podStartSLOduration=1.822474712 podStartE2EDuration="1.822474712s" podCreationTimestamp="2026-03-08 19:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:02.814073087 +0000 UTC m=+1284.210127120" watchObservedRunningTime="2026-03-08 19:53:02.822474712 +0000 UTC m=+1284.218528735" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.839574 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.839615 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.846416 4885 scope.go:117] "RemoveContainer" containerID="1640371a030e2f7def6f8a64bd3eab03cae3c318d2de3e683f5f0df21e52c94a" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.846643 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.853889 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.882259 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.889400 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:03 crc kubenswrapper[4885]: I0308 19:53:03.409373 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" path="/var/lib/kubelet/pods/4e1d0a39-d199-43d0-bdea-33c0ecfff06f/volumes" Mar 08 19:53:03 crc kubenswrapper[4885]: I0308 19:53:03.821197 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:03 crc kubenswrapper[4885]: I0308 19:53:03.821561 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:03 crc kubenswrapper[4885]: I0308 19:53:03.940260 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:53:04 crc kubenswrapper[4885]: I0308 19:53:04.841404 4885 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fwjsc" event={"ID":"46290bd2-6ad7-46f4-86f4-48aa73bc304a","Type":"ContainerStarted","Data":"155731b1565c2836cebbf6fadafab50001c261430bf9d84221bbe681fb56634d"} Mar 08 19:53:04 crc kubenswrapper[4885]: I0308 19:53:04.844027 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mp8c9" event={"ID":"2efc22fd-a92b-422c-876d-7b80f06928b2","Type":"ContainerStarted","Data":"43662ed70d9fce30619b2928a293996c741d8618375e00a25c69cc3ec2f8804c"} Mar 08 19:53:04 crc kubenswrapper[4885]: I0308 19:53:04.861430 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-fwjsc" podStartSLOduration=3.224275969 podStartE2EDuration="36.861409829s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="2026-03-08 19:52:30.206775219 +0000 UTC m=+1251.602829242" lastFinishedPulling="2026-03-08 19:53:03.843909069 +0000 UTC m=+1285.239963102" observedRunningTime="2026-03-08 19:53:04.858728967 +0000 UTC m=+1286.254783010" watchObservedRunningTime="2026-03-08 19:53:04.861409829 +0000 UTC m=+1286.257463862" Mar 08 19:53:04 crc kubenswrapper[4885]: I0308 19:53:04.879698 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mp8c9" podStartSLOduration=2.891526095 podStartE2EDuration="36.879674168s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="2026-03-08 19:52:29.816070173 +0000 UTC m=+1251.212124196" lastFinishedPulling="2026-03-08 19:53:03.804218246 +0000 UTC m=+1285.200272269" observedRunningTime="2026-03-08 19:53:04.874682934 +0000 UTC m=+1286.270736977" watchObservedRunningTime="2026-03-08 19:53:04.879674168 +0000 UTC m=+1286.275728191" Mar 08 19:53:05 crc kubenswrapper[4885]: I0308 19:53:05.861195 4885 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 19:53:05 crc kubenswrapper[4885]: I0308 19:53:05.861453 4885 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 19:53:06 crc kubenswrapper[4885]: I0308 19:53:06.022254 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:06 crc kubenswrapper[4885]: I0308 19:53:06.023523 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:06 crc kubenswrapper[4885]: I0308 19:53:06.873319 4885 generic.go:334] "Generic (PLEG): container finished" podID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" containerID="155731b1565c2836cebbf6fadafab50001c261430bf9d84221bbe681fb56634d" exitCode=0 Mar 08 19:53:06 crc kubenswrapper[4885]: I0308 19:53:06.873414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fwjsc" event={"ID":"46290bd2-6ad7-46f4-86f4-48aa73bc304a","Type":"ContainerDied","Data":"155731b1565c2836cebbf6fadafab50001c261430bf9d84221bbe681fb56634d"} Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.655590 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.704858 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6f64\" (UniqueName: \"kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64\") pod \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.704946 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data\") pod \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.705046 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle\") pod \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.714033 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "46290bd2-6ad7-46f4-86f4-48aa73bc304a" (UID: "46290bd2-6ad7-46f4-86f4-48aa73bc304a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.714091 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64" (OuterVolumeSpecName: "kube-api-access-w6f64") pod "46290bd2-6ad7-46f4-86f4-48aa73bc304a" (UID: "46290bd2-6ad7-46f4-86f4-48aa73bc304a"). 
InnerVolumeSpecName "kube-api-access-w6f64". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.734857 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46290bd2-6ad7-46f4-86f4-48aa73bc304a" (UID: "46290bd2-6ad7-46f4-86f4-48aa73bc304a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.807868 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6f64\" (UniqueName: \"kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.807907 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.807936 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.929387 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fwjsc" event={"ID":"46290bd2-6ad7-46f4-86f4-48aa73bc304a","Type":"ContainerDied","Data":"e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268"} Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.929437 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.929500 4885 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.059049 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"] Mar 08 19:53:09 crc kubenswrapper[4885]: E0308 19:53:09.059946 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" containerName="barbican-db-sync" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.059960 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" containerName="barbican-db-sync" Mar 08 19:53:09 crc kubenswrapper[4885]: E0308 19:53:09.059997 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="dnsmasq-dns" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.060004 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="dnsmasq-dns" Mar 08 19:53:09 crc kubenswrapper[4885]: E0308 19:53:09.060029 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="init" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.060036 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="init" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.060368 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="dnsmasq-dns" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.060389 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" containerName="barbican-db-sync" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.063389 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.070485 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.070980 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.070195 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zrht4" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.080094 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.112818 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.115819 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.115867 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.115903 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.115954 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.115983 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4rc\" (UniqueName: \"kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.119489 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.128030 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.140016 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.176335 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.178446 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.183960 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217346 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217386 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217418 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td4rc\" (UniqueName: \"kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217442 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217462 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfdxd\" (UniqueName: \"kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217502 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217516 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217529 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217553 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: 
I0308 19:53:09.217571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpmg8\" (UniqueName: \"kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217627 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217650 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217675 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: 
\"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217697 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.218107 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.225036 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.225563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom\") pod 
\"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.226573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.239405 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td4rc\" (UniqueName: \"kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320700 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320742 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320763 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs\") pod 
\"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpmg8\" (UniqueName: \"kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320873 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320961 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data\") pod 
\"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.321054 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.321103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.321131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfdxd\" (UniqueName: \"kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.330895 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.330963 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: 
\"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.334192 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.351309 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.351527 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.351685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.351691 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc 
kubenswrapper[4885]: I0308 19:53:09.353580 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.355162 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.355374 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.358537 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.370996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfdxd\" (UniqueName: \"kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.372407 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpmg8\" (UniqueName: \"kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.375477 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: 
\"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.390818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.423033 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.423113 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.423152 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nclf\" (UniqueName: \"kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.423195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.423225 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.442435 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.461582 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.520401 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.524759 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.524830 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.524873 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nclf\" (UniqueName: \"kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf\") pod \"barbican-api-86998568fb-9gsxz\" (UID: 
\"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.524977 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.525022 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.565300 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.568619 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.568988 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 
crc kubenswrapper[4885]: I0308 19:53:09.570763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nclf\" (UniqueName: \"kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.572632 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.682525 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.943503 4885 generic.go:334] "Generic (PLEG): container finished" podID="2efc22fd-a92b-422c-876d-7b80f06928b2" containerID="43662ed70d9fce30619b2928a293996c741d8618375e00a25c69cc3ec2f8804c" exitCode=0 Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.943542 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mp8c9" event={"ID":"2efc22fd-a92b-422c-876d-7b80f06928b2","Type":"ContainerDied","Data":"43662ed70d9fce30619b2928a293996c741d8618375e00a25c69cc3ec2f8804c"} Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.956525 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerStarted","Data":"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d"} Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.956632 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-central-agent" containerID="cri-o://e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f" gracePeriod=30 Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.957121 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="proxy-httpd" containerID="cri-o://777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d" gracePeriod=30 Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.957145 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-notification-agent" containerID="cri-o://b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2" gracePeriod=30 Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.957199 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="sg-core" containerID="cri-o://37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b" gracePeriod=30 Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.987722 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.739569477 podStartE2EDuration="42.987707053s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="2026-03-08 19:52:30.375314509 +0000 UTC m=+1251.771368532" lastFinishedPulling="2026-03-08 19:53:10.623452075 +0000 UTC m=+1292.019506108" observedRunningTime="2026-03-08 19:53:10.986687396 +0000 UTC m=+1292.382741419" watchObservedRunningTime="2026-03-08 19:53:10.987707053 +0000 UTC m=+1292.383761076" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.094846 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"] Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.137740 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"] Mar 08 19:53:11 crc kubenswrapper[4885]: W0308 19:53:11.156877 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42432d22_20ca_464e_be0b_e881c9ef89a7.slice/crio-eb4101a356a90503cf71164dfd919b65584b1da7619bca006da1e2b73f5cb2e2 WatchSource:0}: Error finding container eb4101a356a90503cf71164dfd919b65584b1da7619bca006da1e2b73f5cb2e2: Status 404 returned error can't find the container with id eb4101a356a90503cf71164dfd919b65584b1da7619bca006da1e2b73f5cb2e2 Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.202827 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"] Mar 08 19:53:11 crc kubenswrapper[4885]: W0308 19:53:11.207436 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a083cf5_4ca2_440c_840a_6b159151609f.slice/crio-7e0b4a7b5579c233c2e49f4395b1b83ad7591cab769b791a33fa19e09b808340 WatchSource:0}: Error finding container 7e0b4a7b5579c233c2e49f4395b1b83ad7591cab769b791a33fa19e09b808340: Status 404 returned error can't find the container with id 7e0b4a7b5579c233c2e49f4395b1b83ad7591cab769b791a33fa19e09b808340 Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.258721 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.322736 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463532 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlqnv\" (UniqueName: \"kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463644 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463678 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463703 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463763 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463940 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.464029 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.464700 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.467808 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts" (OuterVolumeSpecName: "scripts") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.469230 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv" (OuterVolumeSpecName: "kube-api-access-wlqnv") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). InnerVolumeSpecName "kube-api-access-wlqnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.473205 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.490257 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.518500 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data" (OuterVolumeSpecName: "config-data") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.566187 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.566225 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.566238 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.566248 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.566259 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlqnv\" (UniqueName: \"kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.968382 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerStarted","Data":"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f"}
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.968419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerStarted","Data":"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd"}
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.968428 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerStarted","Data":"004645ddace08c8f81341f8f25e364cdb5c96da8e77e37b02915c3180098f6f0"}
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.970240 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86998568fb-9gsxz"
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.970470 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86998568fb-9gsxz"
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.979500 4885 generic.go:334] "Generic (PLEG): container finished" podID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerID="a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf" exitCode=0
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.979580 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" event={"ID":"42432d22-20ca-464e-be0b-e881c9ef89a7","Type":"ContainerDied","Data":"a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf"}
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.979603 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" event={"ID":"42432d22-20ca-464e-be0b-e881c9ef89a7","Type":"ContainerStarted","Data":"eb4101a356a90503cf71164dfd919b65584b1da7619bca006da1e2b73f5cb2e2"}
Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.984210 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerStarted","Data":"7e0b4a7b5579c233c2e49f4395b1b83ad7591cab769b791a33fa19e09b808340"}
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.005293 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mp8c9" event={"ID":"2efc22fd-a92b-422c-876d-7b80f06928b2","Type":"ContainerDied","Data":"d6ed11f43445c4cfcc1c38c863e1b91670c997a17a500e0e306eca10269bbe60"}
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.005339 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6ed11f43445c4cfcc1c38c863e1b91670c997a17a500e0e306eca10269bbe60"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.005428 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mp8c9"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.011325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerStarted","Data":"7d126567b856b73925e9e50b783a515a23fdff84d4ca27fd2089e38d86b58980"}
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.014531 4885 generic.go:334] "Generic (PLEG): container finished" podID="624830da-2b73-4843-bb04-6db9c1a7b281" containerID="37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b" exitCode=2
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.014561 4885 generic.go:334] "Generic (PLEG): container finished" podID="624830da-2b73-4843-bb04-6db9c1a7b281" containerID="e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f" exitCode=0
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.014580 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerDied","Data":"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b"}
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.014601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerDied","Data":"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f"}
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.015496 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-86998568fb-9gsxz" podStartSLOduration=3.015470899 podStartE2EDuration="3.015470899s" podCreationTimestamp="2026-03-08 19:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:12.001953527 +0000 UTC m=+1293.398007550" watchObservedRunningTime="2026-03-08 19:53:12.015470899 +0000 UTC m=+1293.411524922"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.210005 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 08 19:53:12 crc kubenswrapper[4885]: E0308 19:53:12.210351 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" containerName="cinder-db-sync"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.210367 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" containerName="cinder-db-sync"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.210551 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" containerName="cinder-db-sync"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.211439 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.224379 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x9f48"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.224499 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.224637 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.224674 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.229170 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.296394 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"]
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.328595 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"]
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.330157 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.342726 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"]
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388329 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388405 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388740 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388846 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388978 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zwcw\" (UniqueName: \"kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.424588 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.426569 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.428356 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.448400 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492184 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492286 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492359 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6vd\" (UniqueName: \"kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492463 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492538 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492624 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492793 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zwcw\" (UniqueName: \"kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492889 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492980 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.493059 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.496189 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.498342 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.498708 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.500866 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.501948 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.513294 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zwcw\" (UniqueName: \"kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.533629 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595175 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595223 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595243 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595265 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58gz4\" (UniqueName: \"kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595394 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595681 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595727 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595757 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595821 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595860 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595887 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595908 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r6vd\" (UniqueName: \"kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.596026 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.596535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.597166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.597422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.597684 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.601586 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.613378 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6vd\" (UniqueName: \"kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.662720 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697746 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697813 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697843 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697906 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697961 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58gz4\" (UniqueName: \"kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.698278 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.698638 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.701682 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.707366 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.707612 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.707855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.713768 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58gz4\" (UniqueName: \"kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0"
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.742837 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.081275 4885 generic.go:334] "Generic (PLEG): container finished" podID="624830da-2b73-4843-bb04-6db9c1a7b281" containerID="b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2" exitCode=0
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.081603 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerDied","Data":"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2"}
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.095895 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" event={"ID":"42432d22-20ca-464e-be0b-e881c9ef89a7","Type":"ContainerStarted","Data":"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6"}
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.096317 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerName="dnsmasq-dns" containerID="cri-o://c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6" gracePeriod=10
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.096422 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z"
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.121860 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" podStartSLOduration=4.121845248 podStartE2EDuration="4.121845248s" podCreationTimestamp="2026-03-08 19:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:13.111763918 +0000 UTC m=+1294.507817941" watchObservedRunningTime="2026-03-08 19:53:13.121845248 +0000 UTC m=+1294.517899261"
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.837418 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z"
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930217 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") "
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930486 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") "
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930513 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") "
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930716 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpmg8\" (UniqueName: \"kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") "
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930784 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") "
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930861 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") "
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.938505 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8" (OuterVolumeSpecName: "kube-api-access-mpmg8") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "kube-api-access-mpmg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.996869 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.006356 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.024594 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.026501 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config" (OuterVolumeSpecName: "config") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.033122 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.033150 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.033161 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.033174 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpmg8\" (UniqueName: \"kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.033184 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.035021 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 08 19:53:14 crc kubenswrapper[4885]: W0308 19:53:14.038818 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ac3e4ad_92f1_4c79_bc47_5e9707b376bf.slice/crio-b4e02e9dba863f76b95bf3b4c26ddf5112fe936b13a4f102d174a8450de9c4d1 WatchSource:0}: Error finding container b4e02e9dba863f76b95bf3b4c26ddf5112fe936b13a4f102d174a8450de9c4d1: Status 404 returned error can't find the container with id b4e02e9dba863f76b95bf3b4c26ddf5112fe936b13a4f102d174a8450de9c4d1
Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.039889 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "dns-swift-storage-0".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.105025 4885 generic.go:334] "Generic (PLEG): container finished" podID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerID="c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6" exitCode=0 Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.105089 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" event={"ID":"42432d22-20ca-464e-be0b-e881c9ef89a7","Type":"ContainerDied","Data":"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6"} Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.105091 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.105116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" event={"ID":"42432d22-20ca-464e-be0b-e881c9ef89a7","Type":"ContainerDied","Data":"eb4101a356a90503cf71164dfd919b65584b1da7619bca006da1e2b73f5cb2e2"} Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.105132 4885 scope.go:117] "RemoveContainer" containerID="c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.107723 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerStarted","Data":"d5cd5c3527dc17515d5a33bed3c5118e0fcbd6d15187bcfb409883f29afc80a6"} Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.111490 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerStarted","Data":"3a20cf21bbfb4da8c71131e4075d64b83bae96d5c5020bc3cfadcf8d7226f8bc"} Mar 08 19:53:14 crc kubenswrapper[4885]: 
I0308 19:53:14.113019 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerStarted","Data":"b4e02e9dba863f76b95bf3b4c26ddf5112fe936b13a4f102d174a8450de9c4d1"} Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.134836 4885 scope.go:117] "RemoveContainer" containerID="a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.136266 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.155979 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.159314 4885 scope.go:117] "RemoveContainer" containerID="c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6" Mar 08 19:53:14 crc kubenswrapper[4885]: E0308 19:53:14.161184 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6\": container with ID starting with c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6 not found: ID does not exist" containerID="c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.161235 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6"} err="failed to get container status \"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6\": rpc error: code = NotFound desc = could not find container \"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6\": container with ID 
starting with c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6 not found: ID does not exist" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.161271 4885 scope.go:117] "RemoveContainer" containerID="a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf" Mar 08 19:53:14 crc kubenswrapper[4885]: E0308 19:53:14.161673 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf\": container with ID starting with a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf not found: ID does not exist" containerID="a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.161700 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf"} err="failed to get container status \"a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf\": rpc error: code = NotFound desc = could not find container \"a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf\": container with ID starting with a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf not found: ID does not exist" Mar 08 19:53:14 crc kubenswrapper[4885]: W0308 19:53:14.163381 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd91d80d_b465_47bc_ab15_cc9281dbb198.slice/crio-6ccedb26b689eacc5264cde7ce6de809146d5707c5200202b877132d6cb530ed WatchSource:0}: Error finding container 6ccedb26b689eacc5264cde7ce6de809146d5707c5200202b877132d6cb530ed: Status 404 returned error can't find the container with id 6ccedb26b689eacc5264cde7ce6de809146d5707c5200202b877132d6cb530ed Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.164640 4885 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"] Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.179123 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"] Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.202576 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"] Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.123293 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerStarted","Data":"6e7be97046549290741b9a7850306bb8d9be298e24617283ccb5d04dda12497f"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.125142 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerStarted","Data":"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.125185 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerStarted","Data":"6ccedb26b689eacc5264cde7ce6de809146d5707c5200202b877132d6cb530ed"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.128194 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerStarted","Data":"b8d53aa1399bba98dc12433735d0a8b3cb69b3036f3c8fb648dbc900fdb658b2"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.130606 4885 generic.go:334] "Generic (PLEG): container finished" podID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerID="41bd07d83b5b4958e58a7473f1f938d73689ec0cd631180b50c3f160c3251d1c" exitCode=0 Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.130660 4885 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" event={"ID":"d087e374-bcc9-4a44-8fbe-aee43a47115e","Type":"ContainerDied","Data":"41bd07d83b5b4958e58a7473f1f938d73689ec0cd631180b50c3f160c3251d1c"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.130718 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" event={"ID":"d087e374-bcc9-4a44-8fbe-aee43a47115e","Type":"ContainerStarted","Data":"4547972efa3892226729dccf70e00d854a9c1e79c44132cd7c28be08c974628a"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.154814 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" podStartSLOduration=3.775041356 podStartE2EDuration="6.154795324s" podCreationTimestamp="2026-03-08 19:53:09 +0000 UTC" firstStartedPulling="2026-03-08 19:53:11.167135365 +0000 UTC m=+1292.563189388" lastFinishedPulling="2026-03-08 19:53:13.546889323 +0000 UTC m=+1294.942943356" observedRunningTime="2026-03-08 19:53:15.147276684 +0000 UTC m=+1296.543330727" watchObservedRunningTime="2026-03-08 19:53:15.154795324 +0000 UTC m=+1296.550849347" Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.198116 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" podStartSLOduration=3.874428516 podStartE2EDuration="6.198100873s" podCreationTimestamp="2026-03-08 19:53:09 +0000 UTC" firstStartedPulling="2026-03-08 19:53:11.218618193 +0000 UTC m=+1292.614672216" lastFinishedPulling="2026-03-08 19:53:13.54229054 +0000 UTC m=+1294.938344573" observedRunningTime="2026-03-08 19:53:15.193253764 +0000 UTC m=+1296.589307787" watchObservedRunningTime="2026-03-08 19:53:15.198100873 +0000 UTC m=+1296.594154896" Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.396237 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" 
path="/var/lib/kubelet/pods/42432d22-20ca-464e-be0b-e881c9ef89a7/volumes" Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.831158 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.148090 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" event={"ID":"d087e374-bcc9-4a44-8fbe-aee43a47115e","Type":"ContainerStarted","Data":"88950a0ff1c85394753db02c54eed22b3bf6bb84a7de33f37b2a9f7d03adb063"} Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.148544 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.180143 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" podStartSLOduration=4.180127915 podStartE2EDuration="4.180127915s" podCreationTimestamp="2026-03-08 19:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:16.177085844 +0000 UTC m=+1297.573139877" watchObservedRunningTime="2026-03-08 19:53:16.180127915 +0000 UTC m=+1297.576181928" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.469878 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-796cf584f6-dfmcm"] Mar 08 19:53:16 crc kubenswrapper[4885]: E0308 19:53:16.470253 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerName="init" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.470271 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerName="init" Mar 08 19:53:16 crc kubenswrapper[4885]: E0308 19:53:16.470293 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" 
containerName="dnsmasq-dns" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.470300 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerName="dnsmasq-dns" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.470476 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerName="dnsmasq-dns" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.472743 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.476108 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.476279 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.478844 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-796cf584f6-dfmcm"] Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.485777 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.485878 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.485971 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587651 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587733 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 
19:53:16.587852 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26jxd\" (UniqueName: \"kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587875 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.588609 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.592487 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.593499 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.689034 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26jxd\" (UniqueName: \"kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.689431 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.689469 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.689533 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.693215 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.693321 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.693781 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.717608 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26jxd\" (UniqueName: \"kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.791519 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.158472 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerStarted","Data":"aedf83af60a05d52fadd9f2d996eac1d5ae1cee15c263accbc4acefecf4b6ed5"} Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.166380 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api-log" containerID="cri-o://d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9" gracePeriod=30 Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.166812 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerStarted","Data":"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"} Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.166863 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.167271 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api" containerID="cri-o://3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d" gracePeriod=30 Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.191967 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.191946192 podStartE2EDuration="5.191946192s" podCreationTimestamp="2026-03-08 19:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:17.190580146 +0000 UTC m=+1298.586634179" 
watchObservedRunningTime="2026-03-08 19:53:17.191946192 +0000 UTC m=+1298.588000225" Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.332110 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-796cf584f6-dfmcm"] Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.829637 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012530 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012595 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012655 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012794 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012820 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58gz4\" 
(UniqueName: \"kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") "
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012842 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") "
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012871 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") "
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.013661 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.014010 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs" (OuterVolumeSpecName: "logs") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.018767 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts" (OuterVolumeSpecName: "scripts") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.018875 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.037328 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4" (OuterVolumeSpecName: "kube-api-access-58gz4") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "kube-api-access-58gz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.041371 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.061720 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data" (OuterVolumeSpecName: "config-data") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114583 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114611 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58gz4\" (UniqueName: \"kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114620 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114630 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114639 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114648 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114657 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.208085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerStarted","Data":"dffb5abae31631b6e6c2de1c5ddd900437c42ccc0290d24bea54ec81cbab75b2"}
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.215908 4885 generic.go:334] "Generic (PLEG): container finished" podID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerID="3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d" exitCode=0
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.215965 4885 generic.go:334] "Generic (PLEG): container finished" podID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerID="d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9" exitCode=143
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.215982 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.216013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerDied","Data":"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"}
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.216042 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerDied","Data":"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"}
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.216052 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerDied","Data":"6ccedb26b689eacc5264cde7ce6de809146d5707c5200202b877132d6cb530ed"}
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.216074 4885 scope.go:117] "RemoveContainer" containerID="3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.221763 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerStarted","Data":"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44"}
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.221866 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerStarted","Data":"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b"}
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.221970 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerStarted","Data":"9666e26b13c4933935ee0abcb40c76da8cace1d3e077db5278af8135676f6e1f"}
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.222061 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-796cf584f6-dfmcm"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.222218 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-796cf584f6-dfmcm"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.241451 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.630458131 podStartE2EDuration="6.241433117s" podCreationTimestamp="2026-03-08 19:53:12 +0000 UTC" firstStartedPulling="2026-03-08 19:53:14.041281844 +0000 UTC m=+1295.437335867" lastFinishedPulling="2026-03-08 19:53:15.65225683 +0000 UTC m=+1297.048310853" observedRunningTime="2026-03-08 19:53:18.226197209 +0000 UTC m=+1299.622251232" watchObservedRunningTime="2026-03-08 19:53:18.241433117 +0000 UTC m=+1299.637487140"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.245777 4885 scope.go:117] "RemoveContainer" containerID="d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.272653 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-796cf584f6-dfmcm" podStartSLOduration=2.27263041 podStartE2EDuration="2.27263041s" podCreationTimestamp="2026-03-08 19:53:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:18.250197111 +0000 UTC m=+1299.646251124" watchObservedRunningTime="2026-03-08 19:53:18.27263041 +0000 UTC m=+1299.668684433"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.280137 4885 scope.go:117] "RemoveContainer" containerID="3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"
Mar 08 19:53:18 crc kubenswrapper[4885]: E0308 19:53:18.285565 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d\": container with ID starting with 3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d not found: ID does not exist" containerID="3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.285610 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"} err="failed to get container status \"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d\": rpc error: code = NotFound desc = could not find container \"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d\": container with ID starting with 3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d not found: ID does not exist"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.285719 4885 scope.go:117] "RemoveContainer" containerID="d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"
Mar 08 19:53:18 crc kubenswrapper[4885]: E0308 19:53:18.288480 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9\": container with ID starting with d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9 not found: ID does not exist" containerID="d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.288535 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"} err="failed to get container status \"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9\": rpc error: code = NotFound desc = could not find container \"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9\": container with ID starting with d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9 not found: ID does not exist"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.288565 4885 scope.go:117] "RemoveContainer" containerID="3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.289177 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"} err="failed to get container status \"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d\": rpc error: code = NotFound desc = could not find container \"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d\": container with ID starting with 3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d not found: ID does not exist"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.289202 4885 scope.go:117] "RemoveContainer" containerID="d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.289582 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"} err="failed to get container status \"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9\": rpc error: code = NotFound desc = could not find container \"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9\": container with ID starting with d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9 not found: ID does not exist"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.290811 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.305978 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.325745 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Mar 08 19:53:18 crc kubenswrapper[4885]: E0308 19:53:18.326241 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.326262 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api"
Mar 08 19:53:18 crc kubenswrapper[4885]: E0308 19:53:18.326294 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api-log"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.326304 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api-log"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.326521 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.326560 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api-log"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.327674 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.329241 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.330554 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.330712 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.331147 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523536 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523598 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523621 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523649 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9zl6\" (UniqueName: \"kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.524018 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625249 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625327 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9zl6\" (UniqueName: \"kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625364 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625426 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625448 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625488 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625525 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625555 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625595 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.626177 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.626267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.630306 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.630797 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.631175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.631971 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.633318 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.633336 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.647854 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9zl6\" (UniqueName: \"kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0"
Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.944855 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 08 19:53:19 crc kubenswrapper[4885]: I0308 19:53:19.378978 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" path="/var/lib/kubelet/pods/bd91d80d-b465-47bc-ab15-cc9281dbb198/volumes"
Mar 08 19:53:19 crc kubenswrapper[4885]: I0308 19:53:19.734143 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 08 19:53:20 crc kubenswrapper[4885]: I0308 19:53:20.239535 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-86998568fb-9gsxz" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 08 19:53:20 crc kubenswrapper[4885]: I0308 19:53:20.256642 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerStarted","Data":"f56864cf75c0bf77d3e8eee5fd2c82834b4c4219c0b0d60077918b5a5fcf0612"}
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.036769 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86998568fb-9gsxz"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.037121 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86998568fb-9gsxz"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.267675 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerStarted","Data":"7305f11ea3e6d044101cca24b88547af01f2e1506724f8e566c2a9df42c34dc6"}
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.267718 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerStarted","Data":"48e1f046f7d97f16af118173fbff33a7753d9f3ef98b111d3153850bfbfdaf65"}
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.268841 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.287232 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.287211182 podStartE2EDuration="3.287211182s" podCreationTimestamp="2026-03-08 19:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:21.286322798 +0000 UTC m=+1302.682376851" watchObservedRunningTime="2026-03-08 19:53:21.287211182 +0000 UTC m=+1302.683265205"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.528898 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bbc5d6644-tztss"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.771672 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"]
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.771993 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56dd4b5ff7-j89qr" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-api" containerID="cri-o://f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304" gracePeriod=30
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.772129 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56dd4b5ff7-j89qr" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-httpd" containerID="cri-o://0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c" gracePeriod=30
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.800635 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"]
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.801935 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56dd4b5ff7-j89qr"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.802185 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.820470 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"]
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.897739 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.897959 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.898096 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.898239 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.898327 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.898400 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdbrd\" (UniqueName: \"kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.898478 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000106 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000515 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000544 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdbrd\" (UniqueName: \"kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000587 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000625 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000686 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000726 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.010396 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.010410 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.015988 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.021833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.025672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.025808 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdbrd\" (UniqueName: \"kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.027369 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp"
Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.122173 4885 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.286995 4885 generic.go:334] "Generic (PLEG): container finished" podID="4de3b511-619d-4637-ac70-f7e555976c0e" containerID="0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c" exitCode=0 Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.288040 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerDied","Data":"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c"} Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.534518 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.665112 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.755564 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.755779 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="dnsmasq-dns" containerID="cri-o://054cb1ab59c45cfe6a5f78ad6e0db5b7c13aecc33b053086164f7b3a9057b5c9" gracePeriod=10 Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.781873 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"] Mar 08 19:53:22 crc kubenswrapper[4885]: W0308 19:53:22.811301 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1b91750_253e_46eb_9a1c_f7208dab2496.slice/crio-021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc WatchSource:0}: 
Error finding container 021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc: Status 404 returned error can't find the container with id 021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.931327 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.355132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerStarted","Data":"17e37a1234fac68b042cb982b6be421ba7a3bd54c84d93b8bbb1842a9f1fa332"} Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.355454 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerStarted","Data":"021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc"} Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.359685 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerID="054cb1ab59c45cfe6a5f78ad6e0db5b7c13aecc33b053086164f7b3a9057b5c9" exitCode=0 Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.360774 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" event={"ID":"3c32ee1c-ff69-4043-a31d-92be1d77a404","Type":"ContainerDied","Data":"054cb1ab59c45cfe6a5f78ad6e0db5b7c13aecc33b053086164f7b3a9057b5c9"} Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.413383 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.414059 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.551758 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.552054 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.552141 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.552159 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.552224 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.552666 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkbcw\" 
(UniqueName: \"kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.556569 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw" (OuterVolumeSpecName: "kube-api-access-mkbcw") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "kube-api-access-mkbcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.599076 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.603179 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.607497 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config" (OuterVolumeSpecName: "config") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.619466 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.620123 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655062 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkbcw\" (UniqueName: \"kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655094 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655114 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655123 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc\") on node \"crc\" 
DevicePath \"\"" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655133 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655141 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.371325 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.371318 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" event={"ID":"3c32ee1c-ff69-4043-a31d-92be1d77a404","Type":"ContainerDied","Data":"d542f54b493f5cfa9eb7867bd9658f59584a89f76c6986f57f0a54179d70ccd3"} Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.371470 4885 scope.go:117] "RemoveContainer" containerID="054cb1ab59c45cfe6a5f78ad6e0db5b7c13aecc33b053086164f7b3a9057b5c9" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.371886 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-56dd4b5ff7-j89qr" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.373490 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerStarted","Data":"46919954f7a8695f89f60ffd2c95fd19f9f50cf97e2bbb06931bbceff7c47a47"} Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.373665 4885 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="cinder-scheduler" containerID="cri-o://aedf83af60a05d52fadd9f2d996eac1d5ae1cee15c263accbc4acefecf4b6ed5" gracePeriod=30 Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.373730 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="probe" containerID="cri-o://dffb5abae31631b6e6c2de1c5ddd900437c42ccc0290d24bea54ec81cbab75b2" gracePeriod=30 Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.398701 4885 scope.go:117] "RemoveContainer" containerID="695a219704dcca2bf986cd5937fa221f84e3701eae327a3dbb20ee1df55cf8bc" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.435788 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5bb5b9c587-nd8hp" podStartSLOduration=3.435764865 podStartE2EDuration="3.435764865s" podCreationTimestamp="2026-03-08 19:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:24.422437358 +0000 UTC m=+1305.818491391" watchObservedRunningTime="2026-03-08 19:53:24.435764865 +0000 UTC m=+1305.831818898" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.450126 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.459308 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:53:25 crc kubenswrapper[4885]: I0308 19:53:25.378838 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" path="/var/lib/kubelet/pods/3c32ee1c-ff69-4043-a31d-92be1d77a404/volumes" Mar 08 19:53:25 crc kubenswrapper[4885]: I0308 19:53:25.386128 4885 generic.go:334] 
"Generic (PLEG): container finished" podID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerID="dffb5abae31631b6e6c2de1c5ddd900437c42ccc0290d24bea54ec81cbab75b2" exitCode=0 Mar 08 19:53:25 crc kubenswrapper[4885]: I0308 19:53:25.386198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerDied","Data":"dffb5abae31631b6e6c2de1c5ddd900437c42ccc0290d24bea54ec81cbab75b2"} Mar 08 19:53:25 crc kubenswrapper[4885]: I0308 19:53:25.387447 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.105336 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.361415 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:53:27 crc kubenswrapper[4885]: E0308 19:53:27.361802 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="init" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.361818 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="init" Mar 08 19:53:27 crc kubenswrapper[4885]: E0308 19:53:27.361847 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="dnsmasq-dns" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.361854 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="dnsmasq-dns" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.371175 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="dnsmasq-dns" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 
19:53:27.376363 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.387750 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520320 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520717 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44fnl\" (UniqueName: \"kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520749 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520779 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520858 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520899 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.521009 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622537 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622585 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622645 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622676 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622713 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622782 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622808 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44fnl\" (UniqueName: \"kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.623448 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.628880 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.629669 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.631431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.632152 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.640078 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: 
\"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.648776 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44fnl\" (UniqueName: \"kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.702980 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.242148 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.295526 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.437912 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.438627 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.438715 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-676dz\" (UniqueName: \"kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz\") pod 
\"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.438769 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.438807 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.438829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.438869 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.443706 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.453368 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz" (OuterVolumeSpecName: "kube-api-access-676dz") pod 
"4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "kube-api-access-676dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.467340 4885 generic.go:334] "Generic (PLEG): container finished" podID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerID="aedf83af60a05d52fadd9f2d996eac1d5ae1cee15c263accbc4acefecf4b6ed5" exitCode=0 Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.467440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerDied","Data":"aedf83af60a05d52fadd9f2d996eac1d5ae1cee15c263accbc4acefecf4b6ed5"} Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.471007 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerStarted","Data":"91a7898b581f4a0b0c09c7d67b2b320f9e2ef08425d7b081856b5b38c0f51cba"} Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.473121 4885 generic.go:334] "Generic (PLEG): container finished" podID="4de3b511-619d-4637-ac70-f7e555976c0e" containerID="f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304" exitCode=0 Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.473170 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.473146 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerDied","Data":"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304"} Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.473297 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerDied","Data":"276c039d660964960e48e559165b8b647e2ab8ab57d7b59f8062379f583a0dc6"} Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.473340 4885 scope.go:117] "RemoveContainer" containerID="0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.489043 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.503915 4885 scope.go:117] "RemoveContainer" containerID="f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.528829 4885 scope.go:117] "RemoveContainer" containerID="0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c" Mar 08 19:53:28 crc kubenswrapper[4885]: E0308 19:53:28.529320 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c\": container with ID starting with 0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c not found: ID does not exist" containerID="0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.529361 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c"} err="failed to get container status \"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c\": rpc error: code = NotFound desc = could not find container \"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c\": container with ID starting with 0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c not found: ID does not exist" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.529385 4885 scope.go:117] "RemoveContainer" containerID="f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304" Mar 08 19:53:28 crc kubenswrapper[4885]: E0308 19:53:28.529751 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304\": container with ID starting with 
f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304 not found: ID does not exist" containerID="f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.529779 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304"} err="failed to get container status \"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304\": rpc error: code = NotFound desc = could not find container \"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304\": container with ID starting with f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304 not found: ID does not exist" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.540761 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.540788 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-676dz\" (UniqueName: \"kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.558598 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.607848 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.617248 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config" (OuterVolumeSpecName: "config") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.624547 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.629282 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.642808 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.642826 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.642834 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.651600 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.698587 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.698629 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.699226 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86998568fb-9gsxz" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api-log" containerID="cri-o://aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd" gracePeriod=30 Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.699452 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86998568fb-9gsxz" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api" containerID="cri-o://49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f" gracePeriod=30 Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748650 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748705 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748731 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748830 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748935 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748970 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zwcw\" (UniqueName: \"kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.749457 4885 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.749470 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.749540 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.757066 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw" (OuterVolumeSpecName: "kube-api-access-2zwcw") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "kube-api-access-2zwcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.765777 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.766200 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts" (OuterVolumeSpecName: "scripts") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.848891 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.851157 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.851196 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.851211 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zwcw\" (UniqueName: \"kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.851224 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.851235 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.910527 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data" (OuterVolumeSpecName: "config-data") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.921052 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"] Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.932781 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"] Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.955519 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.333729 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.339228 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.408492 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" path="/var/lib/kubelet/pods/4de3b511-619d-4637-ac70-f7e555976c0e/volumes" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.497493 4885 generic.go:334] "Generic (PLEG): container finished" podID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerID="aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd" exitCode=143 Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.497589 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerDied","Data":"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd"} Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.502023 4885 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerStarted","Data":"1b8b8e4856a24e16b23d4c15ef261857dfcc94531017a1b02728028102e1d5ce"} Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.502066 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerStarted","Data":"2ea3e6e51d477fb7795967def43fe0be522063fbf11d9053ce41fa22a8bf42b3"} Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.503490 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.503520 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.512448 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.513005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerDied","Data":"b4e02e9dba863f76b95bf3b4c26ddf5112fe936b13a4f102d174a8450de9c4d1"} Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.513039 4885 scope.go:117] "RemoveContainer" containerID="dffb5abae31631b6e6c2de1c5ddd900437c42ccc0290d24bea54ec81cbab75b2" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.537896 4885 scope.go:117] "RemoveContainer" containerID="aedf83af60a05d52fadd9f2d996eac1d5ae1cee15c263accbc4acefecf4b6ed5" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.540421 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-58c657b6d6-r4tf7" podStartSLOduration=2.540382194 podStartE2EDuration="2.540382194s" podCreationTimestamp="2026-03-08 19:53:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:29.529337569 +0000 UTC m=+1310.925391592" watchObservedRunningTime="2026-03-08 19:53:29.540382194 +0000 UTC m=+1310.936436217" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.573468 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.585783 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.599671 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:29 crc kubenswrapper[4885]: E0308 19:53:29.600080 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="probe" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 
19:53:29.600098 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="probe" Mar 08 19:53:29 crc kubenswrapper[4885]: E0308 19:53:29.600117 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-api" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600124 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-api" Mar 08 19:53:29 crc kubenswrapper[4885]: E0308 19:53:29.600140 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-httpd" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600148 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-httpd" Mar 08 19:53:29 crc kubenswrapper[4885]: E0308 19:53:29.600166 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="cinder-scheduler" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600172 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="cinder-scheduler" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600330 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="cinder-scheduler" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600342 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="probe" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600354 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-httpd" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600362 4885 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-api" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.601730 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.604985 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.607514 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.769323 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.769627 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.769732 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.769849 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt46p\" (UniqueName: 
\"kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.769945 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.770019 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871290 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871438 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871470 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt46p\" (UniqueName: \"kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871514 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.874122 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.878198 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.879239 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.880681 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.880956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.893137 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt46p\" (UniqueName: \"kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.932872 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:53:30 crc kubenswrapper[4885]: I0308 19:53:30.422828 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:30 crc kubenswrapper[4885]: I0308 19:53:30.541661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerStarted","Data":"4d0a6fa6c058e8e3d990a208a072ec9c1c565777e02360b62370fc36d2e37246"} Mar 08 19:53:31 crc kubenswrapper[4885]: I0308 19:53:31.234773 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 08 19:53:31 crc kubenswrapper[4885]: I0308 19:53:31.380893 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" path="/var/lib/kubelet/pods/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf/volumes" Mar 08 19:53:31 crc kubenswrapper[4885]: I0308 19:53:31.551433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerStarted","Data":"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12"} Mar 08 19:53:31 crc kubenswrapper[4885]: I0308 19:53:31.873000 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-86998568fb-9gsxz" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:45958->10.217.0.163:9311: read: connection reset by peer" Mar 08 19:53:31 crc kubenswrapper[4885]: I0308 19:53:31.873000 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-86998568fb-9gsxz" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 
10.217.0.2:45972->10.217.0.163:9311: read: connection reset by peer" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.268459 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.473456 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle\") pod \"5d5033f1-b303-4891-875f-8f9bcb7585c0\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.473654 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs\") pod \"5d5033f1-b303-4891-875f-8f9bcb7585c0\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.473682 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nclf\" (UniqueName: \"kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf\") pod \"5d5033f1-b303-4891-875f-8f9bcb7585c0\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.473715 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data\") pod \"5d5033f1-b303-4891-875f-8f9bcb7585c0\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.473770 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom\") pod \"5d5033f1-b303-4891-875f-8f9bcb7585c0\" (UID: 
\"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.476437 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs" (OuterVolumeSpecName: "logs") pod "5d5033f1-b303-4891-875f-8f9bcb7585c0" (UID: "5d5033f1-b303-4891-875f-8f9bcb7585c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.489307 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5d5033f1-b303-4891-875f-8f9bcb7585c0" (UID: "5d5033f1-b303-4891-875f-8f9bcb7585c0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.494143 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf" (OuterVolumeSpecName: "kube-api-access-5nclf") pod "5d5033f1-b303-4891-875f-8f9bcb7585c0" (UID: "5d5033f1-b303-4891-875f-8f9bcb7585c0"). InnerVolumeSpecName "kube-api-access-5nclf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.509518 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d5033f1-b303-4891-875f-8f9bcb7585c0" (UID: "5d5033f1-b303-4891-875f-8f9bcb7585c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.539055 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data" (OuterVolumeSpecName: "config-data") pod "5d5033f1-b303-4891-875f-8f9bcb7585c0" (UID: "5d5033f1-b303-4891-875f-8f9bcb7585c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.561548 4885 generic.go:334] "Generic (PLEG): container finished" podID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerID="49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f" exitCode=0 Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.561622 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerDied","Data":"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f"} Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.561647 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerDied","Data":"004645ddace08c8f81341f8f25e364cdb5c96da8e77e37b02915c3180098f6f0"} Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.561664 4885 scope.go:117] "RemoveContainer" containerID="49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.561790 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.563688 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerStarted","Data":"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5"} Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.583729 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.584247 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.584265 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nclf\" (UniqueName: \"kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.584277 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.584286 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.585449 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.58543272 podStartE2EDuration="3.58543272s" podCreationTimestamp="2026-03-08 19:53:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:32.58242372 +0000 UTC m=+1313.978477733" watchObservedRunningTime="2026-03-08 19:53:32.58543272 +0000 UTC m=+1313.981486743" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.621421 4885 scope.go:117] "RemoveContainer" containerID="aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.633409 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.637346 4885 scope.go:117] "RemoveContainer" containerID="49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f" Mar 08 19:53:32 crc kubenswrapper[4885]: E0308 19:53:32.637810 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f\": container with ID starting with 49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f not found: ID does not exist" containerID="49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.637837 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f"} err="failed to get container status \"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f\": rpc error: code = NotFound desc = could not find container \"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f\": container with ID starting with 49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f not found: ID does not exist" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.637873 4885 scope.go:117] "RemoveContainer" 
containerID="aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd" Mar 08 19:53:32 crc kubenswrapper[4885]: E0308 19:53:32.638153 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd\": container with ID starting with aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd not found: ID does not exist" containerID="aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.638230 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd"} err="failed to get container status \"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd\": rpc error: code = NotFound desc = could not find container \"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd\": container with ID starting with aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd not found: ID does not exist" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.641615 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.818713 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.818770 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:53:33 crc kubenswrapper[4885]: I0308 19:53:33.407795 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" path="/var/lib/kubelet/pods/5d5033f1-b303-4891-875f-8f9bcb7585c0/volumes" Mar 08 19:53:33 crc kubenswrapper[4885]: I0308 19:53:33.491716 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:34 crc kubenswrapper[4885]: I0308 19:53:34.933328 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.300768 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:53:37 crc kubenswrapper[4885]: E0308 19:53:37.301789 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.301807 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api" Mar 08 19:53:37 crc kubenswrapper[4885]: E0308 19:53:37.301827 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api-log" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.301836 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api-log" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.302059 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api-log" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.302088 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" 
containerName="barbican-api" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.303912 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.305673 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.307219 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.307469 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.335505 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384448 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384557 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384632 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: 
\"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384663 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384724 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384879 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384989 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsh5k\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k\") 
pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486505 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486567 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486627 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486675 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsh5k\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486738 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: 
\"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486844 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486866 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.487478 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.487496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 
crc kubenswrapper[4885]: I0308 19:53:37.494588 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.494858 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.495373 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.496066 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.499774 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.510794 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jsh5k\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.623871 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:38 crc kubenswrapper[4885]: W0308 19:53:38.176586 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60f9821e_e554_4594_bfb2_9521cd3c171a.slice/crio-a9b17f7da8fbb915380a49441fc73dc30251abdd34aa8c926a52efea3b64bcdc WatchSource:0}: Error finding container a9b17f7da8fbb915380a49441fc73dc30251abdd34aa8c926a52efea3b64bcdc: Status 404 returned error can't find the container with id a9b17f7da8fbb915380a49441fc73dc30251abdd34aa8c926a52efea3b64bcdc Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.177451 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.408681 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.410552 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.413635 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.414528 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.415070 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-r9g5q" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.419244 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.503951 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.504088 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.504225 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.504335 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ml5\" (UniqueName: \"kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.606321 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.606374 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.606425 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.606449 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6ml5\" (UniqueName: \"kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.607458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.612622 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.612838 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.636153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ml5\" (UniqueName: \"kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.639303 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerStarted","Data":"67f1959dc61ea688f5069874c4fb20e1a6cd0f9f33725553c532e43624ce5124"} Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.639352 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerStarted","Data":"a9b17f7da8fbb915380a49441fc73dc30251abdd34aa8c926a52efea3b64bcdc"} Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.723087 4885 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.723887 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.729192 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.900654 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.902802 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.912386 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.912751 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.912875 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.912990 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phmx9\" (UniqueName: \"kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.926101 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:38 crc kubenswrapper[4885]: E0308 19:53:38.964692 4885 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 08 19:53:38 crc kubenswrapper[4885]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_c6c74f05-881e-48c0-82d2-d90356ad15eb_0(ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404" Netns:"/var/run/netns/2ba25179-213b-4737-9a9a-c6d6279fe0a3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404;K8S_POD_UID=c6c74f05-881e-48c0-82d2-d90356ad15eb" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/c6c74f05-881e-48c0-82d2-d90356ad15eb]: expected pod UID "c6c74f05-881e-48c0-82d2-d90356ad15eb" but got "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" from Kube API Mar 08 19:53:38 crc kubenswrapper[4885]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 19:53:38 crc kubenswrapper[4885]: > Mar 08 19:53:38 crc kubenswrapper[4885]: E0308 19:53:38.964759 4885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 08 19:53:38 crc kubenswrapper[4885]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_c6c74f05-881e-48c0-82d2-d90356ad15eb_0(ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404" Netns:"/var/run/netns/2ba25179-213b-4737-9a9a-c6d6279fe0a3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404;K8S_POD_UID=c6c74f05-881e-48c0-82d2-d90356ad15eb" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/c6c74f05-881e-48c0-82d2-d90356ad15eb]: expected pod UID "c6c74f05-881e-48c0-82d2-d90356ad15eb" but got "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" from Kube API Mar 08 19:53:38 crc kubenswrapper[4885]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 19:53:38 crc kubenswrapper[4885]: > pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.016289 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmx9\" (UniqueName: \"kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.016358 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.016491 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.016548 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.017755 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.027805 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.040338 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.042702 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmx9\" (UniqueName: \"kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.223785 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.648539 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.649071 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerStarted","Data":"52b3a63b99137a90e03814ec54cafbe70d9ceac7f36dc9e81e362bf716f0276b"} Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.649146 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.649181 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.655192 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:39 crc kubenswrapper[4885]: W0308 19:53:39.674034 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddd62ab9_bb59_47ef_b639_fd0a0a4c4b84.slice/crio-b77651b6a5d5a919a5f8a7da9eb432b61a596a28b9541d2b704c7148835a224c WatchSource:0}: Error finding container b77651b6a5d5a919a5f8a7da9eb432b61a596a28b9541d2b704c7148835a224c: Status 404 returned error can't find the container with id b77651b6a5d5a919a5f8a7da9eb432b61a596a28b9541d2b704c7148835a224c Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.680212 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-744484b5fc-g6mjz" podStartSLOduration=2.680189896 podStartE2EDuration="2.680189896s" podCreationTimestamp="2026-03-08 19:53:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:39.675955963 +0000 UTC m=+1321.072009986" watchObservedRunningTime="2026-03-08 19:53:39.680189896 +0000 UTC m=+1321.076243919" Mar 08 
19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.699839 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.702495 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c6c74f05-881e-48c0-82d2-d90356ad15eb" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.729732 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle\") pod \"c6c74f05-881e-48c0-82d2-d90356ad15eb\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.729787 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config\") pod \"c6c74f05-881e-48c0-82d2-d90356ad15eb\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.729824 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6ml5\" (UniqueName: \"kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5\") pod \"c6c74f05-881e-48c0-82d2-d90356ad15eb\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.729865 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret\") pod \"c6c74f05-881e-48c0-82d2-d90356ad15eb\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.730297 4885 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c6c74f05-881e-48c0-82d2-d90356ad15eb" (UID: "c6c74f05-881e-48c0-82d2-d90356ad15eb"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.731116 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.741039 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5" (OuterVolumeSpecName: "kube-api-access-c6ml5") pod "c6c74f05-881e-48c0-82d2-d90356ad15eb" (UID: "c6c74f05-881e-48c0-82d2-d90356ad15eb"). InnerVolumeSpecName "kube-api-access-c6ml5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.752058 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6c74f05-881e-48c0-82d2-d90356ad15eb" (UID: "c6c74f05-881e-48c0-82d2-d90356ad15eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.752111 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c6c74f05-881e-48c0-82d2-d90356ad15eb" (UID: "c6c74f05-881e-48c0-82d2-d90356ad15eb"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.832961 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.832996 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6ml5\" (UniqueName: \"kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.833009 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:40 crc kubenswrapper[4885]: I0308 19:53:40.157276 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 08 19:53:40 crc kubenswrapper[4885]: I0308 19:53:40.660836 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84","Type":"ContainerStarted","Data":"b77651b6a5d5a919a5f8a7da9eb432b61a596a28b9541d2b704c7148835a224c"} Mar 08 19:53:40 crc kubenswrapper[4885]: I0308 19:53:40.660884 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:40 crc kubenswrapper[4885]: I0308 19:53:40.677573 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c6c74f05-881e-48c0-82d2-d90356ad15eb" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.381351 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6c74f05-881e-48c0-82d2-d90356ad15eb" path="/var/lib/kubelet/pods/c6c74f05-881e-48c0-82d2-d90356ad15eb/volumes" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.465753 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.566331 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.566729 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.566954 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567103 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567198 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bkw5\" (UniqueName: \"kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567365 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567773 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567840 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.588606 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts" (OuterVolumeSpecName: "scripts") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.589870 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5" (OuterVolumeSpecName: "kube-api-access-5bkw5") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "kube-api-access-5bkw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.598155 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.650615 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.668857 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.669253 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.669316 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.669382 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.669444 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bkw5\" (UniqueName: \"kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.669503 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.676252 4885 generic.go:334] "Generic (PLEG): container finished" podID="624830da-2b73-4843-bb04-6db9c1a7b281" containerID="777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d" exitCode=137 Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.676310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerDied","Data":"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d"} Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.676368 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerDied","Data":"7df05fec0614fac93aab8ae29777fc59abe79ffd7aca5937f60cd7d723e3ec63"} Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.676388 4885 scope.go:117] "RemoveContainer" containerID="777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.676506 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.687442 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data" (OuterVolumeSpecName: "config-data") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.697397 4885 scope.go:117] "RemoveContainer" containerID="37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.751555 4885 scope.go:117] "RemoveContainer" containerID="b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.772150 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.773739 4885 scope.go:117] "RemoveContainer" containerID="e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.793296 4885 scope.go:117] "RemoveContainer" containerID="777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d" Mar 08 19:53:41 crc kubenswrapper[4885]: E0308 19:53:41.793735 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d\": container with ID starting with 777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d not found: ID does not exist" containerID="777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.793779 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d"} err="failed to get container status \"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d\": rpc error: code = NotFound desc = could not find container \"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d\": container with ID starting with 
777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d not found: ID does not exist" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.793806 4885 scope.go:117] "RemoveContainer" containerID="37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b" Mar 08 19:53:41 crc kubenswrapper[4885]: E0308 19:53:41.794174 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b\": container with ID starting with 37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b not found: ID does not exist" containerID="37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.794231 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b"} err="failed to get container status \"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b\": rpc error: code = NotFound desc = could not find container \"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b\": container with ID starting with 37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b not found: ID does not exist" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.794265 4885 scope.go:117] "RemoveContainer" containerID="b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2" Mar 08 19:53:41 crc kubenswrapper[4885]: E0308 19:53:41.794741 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2\": container with ID starting with b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2 not found: ID does not exist" containerID="b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2" Mar 08 19:53:41 crc 
kubenswrapper[4885]: I0308 19:53:41.794774 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2"} err="failed to get container status \"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2\": rpc error: code = NotFound desc = could not find container \"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2\": container with ID starting with b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2 not found: ID does not exist" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.794793 4885 scope.go:117] "RemoveContainer" containerID="e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f" Mar 08 19:53:41 crc kubenswrapper[4885]: E0308 19:53:41.795028 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f\": container with ID starting with e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f not found: ID does not exist" containerID="e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.795059 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f"} err="failed to get container status \"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f\": rpc error: code = NotFound desc = could not find container \"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f\": container with ID starting with e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f not found: ID does not exist" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.024451 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:42 crc kubenswrapper[4885]: 
I0308 19:53:42.042267 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052250 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:42 crc kubenswrapper[4885]: E0308 19:53:42.052646 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="proxy-httpd" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052666 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="proxy-httpd" Mar 08 19:53:42 crc kubenswrapper[4885]: E0308 19:53:42.052676 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="sg-core" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052682 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="sg-core" Mar 08 19:53:42 crc kubenswrapper[4885]: E0308 19:53:42.052711 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-notification-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052719 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-notification-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: E0308 19:53:42.052729 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-central-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052734 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-central-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052892 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-central-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052907 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-notification-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052933 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="sg-core" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052947 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="proxy-httpd" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.054469 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.056757 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.056839 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.063309 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.183835 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.183992 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts\") pod \"ceilometer-0\" (UID: 
\"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.184018 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.184085 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.184509 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.184590 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54bx7\" (UniqueName: \"kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.184626 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: 
I0308 19:53:42.285787 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.285887 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.285913 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54bx7\" (UniqueName: \"kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.285945 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.285994 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.286033 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts\") pod 
\"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.286052 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.286551 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.286573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.292781 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.295669 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.297377 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.309836 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54bx7\" (UniqueName: \"kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.323492 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.418650 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.882277 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:43 crc kubenswrapper[4885]: I0308 19:53:43.382508 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" path="/var/lib/kubelet/pods/624830da-2b73-4843-bb04-6db9c1a7b281/volumes" Mar 08 19:53:43 crc kubenswrapper[4885]: I0308 19:53:43.701333 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerStarted","Data":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} Mar 08 19:53:43 crc kubenswrapper[4885]: I0308 19:53:43.701631 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerStarted","Data":"ff053af0de43567bfe97eb303e2667a54755259da1bbe8ed4301748389714730"} Mar 08 19:53:44 crc kubenswrapper[4885]: I0308 19:53:44.711189 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerStarted","Data":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} Mar 08 19:53:47 crc kubenswrapper[4885]: I0308 19:53:47.629007 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:47 crc kubenswrapper[4885]: I0308 19:53:47.637209 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:48 crc kubenswrapper[4885]: I0308 19:53:48.392026 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:51 crc kubenswrapper[4885]: I0308 19:53:51.810753 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerStarted","Data":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} Mar 08 19:53:51 crc kubenswrapper[4885]: I0308 19:53:51.828125 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84","Type":"ContainerStarted","Data":"6670e6817995526cd80a6c1b2064f3af999a3d367e59a87b40d4c34b2c61c6e3"} Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.146389 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.165347 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.065946552 podStartE2EDuration="14.165326604s" 
podCreationTimestamp="2026-03-08 19:53:38 +0000 UTC" firstStartedPulling="2026-03-08 19:53:39.677437793 +0000 UTC m=+1321.073491816" lastFinishedPulling="2026-03-08 19:53:50.776817845 +0000 UTC m=+1332.172871868" observedRunningTime="2026-03-08 19:53:51.849372916 +0000 UTC m=+1333.245426939" watchObservedRunningTime="2026-03-08 19:53:52.165326604 +0000 UTC m=+1333.561380627" Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.216065 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.216291 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bbc5d6644-tztss" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-api" containerID="cri-o://5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b" gracePeriod=30 Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.216401 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bbc5d6644-tztss" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-httpd" containerID="cri-o://47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab" gracePeriod=30 Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.837863 4885 generic.go:334] "Generic (PLEG): container finished" podID="de714834-e155-41c1-83fc-a050203bde75" containerID="47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab" exitCode=0 Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.838409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerDied","Data":"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab"} Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846281 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerStarted","Data":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846583 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="sg-core" containerID="cri-o://75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" gracePeriod=30 Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846599 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846540 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="proxy-httpd" containerID="cri-o://e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" gracePeriod=30 Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846692 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-notification-agent" containerID="cri-o://4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" gracePeriod=30 Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846688 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-central-agent" containerID="cri-o://ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" gracePeriod=30 Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.867366 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.028266946 podStartE2EDuration="11.867350117s" podCreationTimestamp="2026-03-08 19:53:42 +0000 UTC" 
firstStartedPulling="2026-03-08 19:53:42.895086895 +0000 UTC m=+1324.291140918" lastFinishedPulling="2026-03-08 19:53:52.734170066 +0000 UTC m=+1334.130224089" observedRunningTime="2026-03-08 19:53:53.864302446 +0000 UTC m=+1335.260356469" watchObservedRunningTime="2026-03-08 19:53:53.867350117 +0000 UTC m=+1335.263404130" Mar 08 19:53:53 crc kubenswrapper[4885]: E0308 19:53:53.887580 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9ffb05_57d2_4576_ac6a_a1d29fd8cfc2.slice/crio-75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.755734 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855486 4885 generic.go:334] "Generic (PLEG): container finished" podID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" exitCode=0 Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855518 4885 generic.go:334] "Generic (PLEG): container finished" podID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" exitCode=2 Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855526 4885 generic.go:334] "Generic (PLEG): container finished" podID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" exitCode=0 Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855533 4885 generic.go:334] "Generic (PLEG): container finished" podID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" exitCode=0 Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 
19:53:54.855549 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerDied","Data":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855552 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855572 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerDied","Data":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855582 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerDied","Data":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855590 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerDied","Data":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855598 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerDied","Data":"ff053af0de43567bfe97eb303e2667a54755259da1bbe8ed4301748389714730"} Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855611 4885 scope.go:117] "RemoveContainer" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.877987 4885 scope.go:117] "RemoveContainer" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc 
kubenswrapper[4885]: I0308 19:53:54.904584 4885 scope.go:117] "RemoveContainer" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.924698 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.924744 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54bx7\" (UniqueName: \"kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.924779 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.924818 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.924866 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.925015 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.925138 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.925739 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.926137 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.928198 4885 scope.go:117] "RemoveContainer" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.930394 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts" (OuterVolumeSpecName: "scripts") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.935232 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7" (OuterVolumeSpecName: "kube-api-access-54bx7") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "kube-api-access-54bx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.957691 4885 scope.go:117] "RemoveContainer" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: E0308 19:53:54.958138 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": container with ID starting with e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70 not found: ID does not exist" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.958180 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} err="failed to get container status \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": rpc error: code = NotFound desc = could not find container \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": container with ID starting with e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.958201 4885 scope.go:117] "RemoveContainer" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc 
kubenswrapper[4885]: E0308 19:53:54.958804 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": container with ID starting with 75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857 not found: ID does not exist" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.958832 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} err="failed to get container status \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": rpc error: code = NotFound desc = could not find container \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": container with ID starting with 75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.958850 4885 scope.go:117] "RemoveContainer" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: E0308 19:53:54.964141 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": container with ID starting with 4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78 not found: ID does not exist" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.964174 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} err="failed to get container status 
\"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": rpc error: code = NotFound desc = could not find container \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": container with ID starting with 4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.964193 4885 scope.go:117] "RemoveContainer" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: E0308 19:53:54.964565 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": container with ID starting with ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843 not found: ID does not exist" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.964594 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} err="failed to get container status \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": rpc error: code = NotFound desc = could not find container \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": container with ID starting with ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.964612 4885 scope.go:117] "RemoveContainer" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.965389 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} err="failed to get 
container status \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": rpc error: code = NotFound desc = could not find container \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": container with ID starting with e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.965466 4885 scope.go:117] "RemoveContainer" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.965669 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.966558 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} err="failed to get container status \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": rpc error: code = NotFound desc = could not find container \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": container with ID starting with 75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.966587 4885 scope.go:117] "RemoveContainer" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967127 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} err="failed to get container 
status \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": rpc error: code = NotFound desc = could not find container \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": container with ID starting with 4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967152 4885 scope.go:117] "RemoveContainer" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967521 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} err="failed to get container status \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": rpc error: code = NotFound desc = could not find container \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": container with ID starting with ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967548 4885 scope.go:117] "RemoveContainer" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967850 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} err="failed to get container status \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": rpc error: code = NotFound desc = could not find container \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": container with ID starting with e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967876 4885 scope.go:117] "RemoveContainer" 
containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968180 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} err="failed to get container status \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": rpc error: code = NotFound desc = could not find container \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": container with ID starting with 75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968207 4885 scope.go:117] "RemoveContainer" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968440 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} err="failed to get container status \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": rpc error: code = NotFound desc = could not find container \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": container with ID starting with 4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968464 4885 scope.go:117] "RemoveContainer" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968883 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} err="failed to get container status \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": rpc error: code = NotFound desc = could 
not find container \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": container with ID starting with ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968906 4885 scope.go:117] "RemoveContainer" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.969207 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} err="failed to get container status \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": rpc error: code = NotFound desc = could not find container \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": container with ID starting with e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.969299 4885 scope.go:117] "RemoveContainer" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.969806 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} err="failed to get container status \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": rpc error: code = NotFound desc = could not find container \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": container with ID starting with 75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.969834 4885 scope.go:117] "RemoveContainer" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 
19:53:54.970379 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} err="failed to get container status \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": rpc error: code = NotFound desc = could not find container \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": container with ID starting with 4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.970438 4885 scope.go:117] "RemoveContainer" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.970793 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} err="failed to get container status \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": rpc error: code = NotFound desc = could not find container \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": container with ID starting with ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843 not found: ID does not exist" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.015138 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027291 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027323 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027333 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54bx7\" (UniqueName: \"kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027345 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027353 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027363 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.040075 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data" (OuterVolumeSpecName: "config-data") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.128659 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.185715 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.192907 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.210314 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:55 crc kubenswrapper[4885]: E0308 19:53:55.210761 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="proxy-httpd" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.210783 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="proxy-httpd" Mar 08 19:53:55 crc kubenswrapper[4885]: E0308 19:53:55.210807 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-notification-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.210815 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-notification-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: E0308 19:53:55.210829 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-central-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.210838 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" 
containerName="ceilometer-central-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: E0308 19:53:55.210856 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="sg-core" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.210862 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="sg-core" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.211074 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-central-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.211099 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="sg-core" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.211109 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-notification-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.211124 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="proxy-httpd" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.212982 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.217304 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.217320 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.225382 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.332321 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.332644 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.332836 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.332985 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " 
pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.333150 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.333192 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.333271 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72jnk\" (UniqueName: \"kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.382618 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" path="/var/lib/kubelet/pods/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2/volumes" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435053 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435124 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts\") pod 
\"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435142 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435710 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72jnk\" (UniqueName: \"kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435767 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435952 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.440881 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.441123 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.441834 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.442582 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 
19:53:55.465772 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72jnk\" (UniqueName: \"kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.485934 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-b5mql"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.486951 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.501494 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-b5mql"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.575532 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.607639 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5fh2h"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.609148 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.638407 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5fh2h"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.650502 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sqzl\" (UniqueName: \"kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.650566 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.697863 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-fdab-account-create-update-vs9sz"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.699298 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.701274 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.705714 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fdab-account-create-update-vs9sz"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.752227 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sqzl\" (UniqueName: \"kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.752279 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcwfw\" (UniqueName: \"kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.752316 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.752466 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " 
pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.753026 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.780521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sqzl\" (UniqueName: \"kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.801508 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hjcwx"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.802933 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.808697 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hjcwx"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.854205 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plmtf\" (UniqueName: \"kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf\") pod \"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.854310 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcwfw\" (UniqueName: \"kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.854346 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts\") pod \"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.854430 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.855106 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.866990 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.888668 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcwfw\" (UniqueName: \"kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.909608 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-qx2kg"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.910789 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.913413 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.935215 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-qx2kg"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.945340 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.959104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.959173 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts\") pod \"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.959253 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4g7t\" (UniqueName: \"kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.959280 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plmtf\" (UniqueName: \"kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf\") pod \"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.960256 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts\") pod 
\"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.976855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plmtf\" (UniqueName: \"kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf\") pod \"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.035935 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.063078 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4g7t\" (UniqueName: \"kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.063451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.063530 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfljk\" (UniqueName: \"kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " 
pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.063563 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.064587 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.083809 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4g7t\" (UniqueName: \"kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.092485 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.108069 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-4khmp"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.109260 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.111975 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.120220 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.123215 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-4khmp"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.165714 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfljk\" (UniqueName: \"kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.165766 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.166439 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.184608 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dfljk\" (UniqueName: \"kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.230844 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.267171 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.267232 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht8t7\" (UniqueName: \"kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.368679 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.368736 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht8t7\" 
(UniqueName: \"kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.369607 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.385306 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht8t7\" (UniqueName: \"kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.479638 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-b5mql"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.498363 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.599441 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5fh2h"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.646379 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fdab-account-create-update-vs9sz"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.664736 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hjcwx"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.921131 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hjcwx" event={"ID":"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c","Type":"ContainerStarted","Data":"c3c38b3a4b419b05af059ea5f575241ba3087897fe3f917341a136955ab80346"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.922535 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5fh2h" event={"ID":"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7","Type":"ContainerStarted","Data":"4eb69137cae07743eb1af1f6321df214566d9dbcbf6527c970e6fc1975ea80cf"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.923798 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fdab-account-create-update-vs9sz" event={"ID":"92191eaa-0c0a-4927-adf4-a4e386ed2552","Type":"ContainerStarted","Data":"3b62b092a692234fe276d2cdd5a40b55370dc15612fb51133501de3a4a2fb489"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.926389 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerStarted","Data":"34e707d299017c5d1c8910ad8b86e1920b5d7996a4afa2abaf7f5e2cb4124ea4"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.929028 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-b5mql" 
event={"ID":"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3","Type":"ContainerStarted","Data":"4cffd27a9b7724e448f78dd5d8fc02f0f0058f5575262e988c93602105c6d597"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.929052 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-b5mql" event={"ID":"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3","Type":"ContainerStarted","Data":"06c6027e971cc55099bbd434c3d7f4bf09a5b78fbf9c1a23eea867d1aa75409f"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.945347 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-b5mql" podStartSLOduration=1.9453322229999999 podStartE2EDuration="1.945332223s" podCreationTimestamp="2026-03-08 19:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:56.943674119 +0000 UTC m=+1338.339728142" watchObservedRunningTime="2026-03-08 19:53:56.945332223 +0000 UTC m=+1338.341386246" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.983388 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-qx2kg"] Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.144220 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-4khmp"] Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.454353 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.611821 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs\") pod \"de714834-e155-41c1-83fc-a050203bde75\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.611903 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config\") pod \"de714834-e155-41c1-83fc-a050203bde75\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.611985 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle\") pod \"de714834-e155-41c1-83fc-a050203bde75\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.612013 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfv9t\" (UniqueName: \"kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t\") pod \"de714834-e155-41c1-83fc-a050203bde75\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.612052 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config\") pod \"de714834-e155-41c1-83fc-a050203bde75\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.616425 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "de714834-e155-41c1-83fc-a050203bde75" (UID: "de714834-e155-41c1-83fc-a050203bde75"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.616492 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t" (OuterVolumeSpecName: "kube-api-access-xfv9t") pod "de714834-e155-41c1-83fc-a050203bde75" (UID: "de714834-e155-41c1-83fc-a050203bde75"). InnerVolumeSpecName "kube-api-access-xfv9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.692683 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de714834-e155-41c1-83fc-a050203bde75" (UID: "de714834-e155-41c1-83fc-a050203bde75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.694054 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config" (OuterVolumeSpecName: "config") pod "de714834-e155-41c1-83fc-a050203bde75" (UID: "de714834-e155-41c1-83fc-a050203bde75"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.713998 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.714244 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfv9t\" (UniqueName: \"kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.714331 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.714398 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.734610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "de714834-e155-41c1-83fc-a050203bde75" (UID: "de714834-e155-41c1-83fc-a050203bde75"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.816253 4885 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.938211 4885 generic.go:334] "Generic (PLEG): container finished" podID="8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" containerID="83c825c6a12d2141eb0dfe1368babc2f8bfb90700bef146c412cb41b76f028b3" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.938280 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" event={"ID":"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41","Type":"ContainerDied","Data":"83c825c6a12d2141eb0dfe1368babc2f8bfb90700bef146c412cb41b76f028b3"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.938305 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" event={"ID":"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41","Type":"ContainerStarted","Data":"76f5bddc0468d531085855eaa891e527dd627f81f438aa98c14e0fe69b754f55"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.939369 4885 generic.go:334] "Generic (PLEG): container finished" podID="92191eaa-0c0a-4927-adf4-a4e386ed2552" containerID="9585f2e0b3d9045954e289a5ad0191eb4ab1e2632be8da00e467a511a692dd4f" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.939446 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fdab-account-create-update-vs9sz" event={"ID":"92191eaa-0c0a-4927-adf4-a4e386ed2552","Type":"ContainerDied","Data":"9585f2e0b3d9045954e289a5ad0191eb4ab1e2632be8da00e467a511a692dd4f"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.941570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerStarted","Data":"7cf70af4753fbcc177c169967bfd0633e149f5d98df36cb2d6ff676d0a215e21"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.941617 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerStarted","Data":"db7de2bfb2402bc7c35eeb0e3a0a80a212c00dd48e7b95c320538d854040bceb"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.943396 4885 generic.go:334] "Generic (PLEG): container finished" podID="7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" containerID="ce47c98e58f66c2a55840d70bb55bfd25b6d54a9fa04407857b7919987c1acd6" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.943454 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hjcwx" event={"ID":"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c","Type":"ContainerDied","Data":"ce47c98e58f66c2a55840d70bb55bfd25b6d54a9fa04407857b7919987c1acd6"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.945026 4885 generic.go:334] "Generic (PLEG): container finished" podID="15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" containerID="eb80fb2a1922a32d725b4ee5e3cc391924d843e1dfc770a23f4293be00620e5f" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.945186 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5fh2h" event={"ID":"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7","Type":"ContainerDied","Data":"eb80fb2a1922a32d725b4ee5e3cc391924d843e1dfc770a23f4293be00620e5f"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.946837 4885 generic.go:334] "Generic (PLEG): container finished" podID="11e75774-c86c-459a-9c66-eaf3c43addac" containerID="0fd5040bc376c8f684c8ba84911a21e03723dd7d09ccc7b3d5b40d2f11712a3d" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.946972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" event={"ID":"11e75774-c86c-459a-9c66-eaf3c43addac","Type":"ContainerDied","Data":"0fd5040bc376c8f684c8ba84911a21e03723dd7d09ccc7b3d5b40d2f11712a3d"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.947089 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" event={"ID":"11e75774-c86c-459a-9c66-eaf3c43addac","Type":"ContainerStarted","Data":"8babcab44df031a7dca85bbd1e1b25d866660a24b8048961074ce85533c9f142"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.948381 4885 generic.go:334] "Generic (PLEG): container finished" podID="e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" containerID="4cffd27a9b7724e448f78dd5d8fc02f0f0058f5575262e988c93602105c6d597" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.948440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-b5mql" event={"ID":"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3","Type":"ContainerDied","Data":"4cffd27a9b7724e448f78dd5d8fc02f0f0058f5575262e988c93602105c6d597"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.950436 4885 generic.go:334] "Generic (PLEG): container finished" podID="de714834-e155-41c1-83fc-a050203bde75" containerID="5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.950468 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerDied","Data":"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.950487 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerDied","Data":"ce2e04e0c937557f90cd3e8f47d07607bc0f9d0c6eb93b3ee180bc8da569c97b"} Mar 08 19:53:57 crc 
kubenswrapper[4885]: I0308 19:53:57.950507 4885 scope.go:117] "RemoveContainer" containerID="47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.950619 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.041495 4885 scope.go:117] "RemoveContainer" containerID="5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.068769 4885 scope.go:117] "RemoveContainer" containerID="47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab" Mar 08 19:53:58 crc kubenswrapper[4885]: E0308 19:53:58.069903 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab\": container with ID starting with 47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab not found: ID does not exist" containerID="47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.069950 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab"} err="failed to get container status \"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab\": rpc error: code = NotFound desc = could not find container \"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab\": container with ID starting with 47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab not found: ID does not exist" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.069967 4885 scope.go:117] "RemoveContainer" containerID="5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b" Mar 08 19:53:58 crc kubenswrapper[4885]: E0308 
19:53:58.072348 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b\": container with ID starting with 5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b not found: ID does not exist" containerID="5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.072385 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b"} err="failed to get container status \"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b\": rpc error: code = NotFound desc = could not find container \"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b\": container with ID starting with 5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b not found: ID does not exist" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.086220 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.093487 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.723861 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.765402 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.845648 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b5685698-p87pb"] Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.845896 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/placement-6b5685698-p87pb" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-log" containerID="cri-o://284122f7c1790bf6573097e7966743c78c5e2016a00bf0c85b35b46fb79ec3ac" gracePeriod=30 Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.846009 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b5685698-p87pb" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-api" containerID="cri-o://4b8fc7ef37c75cd3a7a88b7cc2d7710779d5554badc6896eed06c03d3625b81a" gracePeriod=30 Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.993889 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerStarted","Data":"30e722bde831d03eed4bc7a2ac2c7c561a897a0bc5aee76137806f8c867c31e9"} Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.383687 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de714834-e155-41c1-83fc-a050203bde75" path="/var/lib/kubelet/pods/de714834-e155-41c1-83fc-a050203bde75/volumes" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.399033 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.549419 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht8t7\" (UniqueName: \"kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7\") pod \"11e75774-c86c-459a-9c66-eaf3c43addac\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.549855 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts\") pod \"11e75774-c86c-459a-9c66-eaf3c43addac\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.550214 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11e75774-c86c-459a-9c66-eaf3c43addac" (UID: "11e75774-c86c-459a-9c66-eaf3c43addac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.552113 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.559144 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7" (OuterVolumeSpecName: "kube-api-access-ht8t7") pod "11e75774-c86c-459a-9c66-eaf3c43addac" (UID: "11e75774-c86c-459a-9c66-eaf3c43addac"). InnerVolumeSpecName "kube-api-access-ht8t7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.656115 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht8t7\" (UniqueName: \"kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.711199 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.724034 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.733894 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.743520 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.840623 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870533 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcwfw\" (UniqueName: \"kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw\") pod \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870610 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts\") pod \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts\") pod \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870756 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts\") pod \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870806 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4g7t\" (UniqueName: \"kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t\") pod \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870880 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts\") pod \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870912 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sqzl\" (UniqueName: \"kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl\") pod \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870983 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfljk\" (UniqueName: \"kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk\") pod \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.873319 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" (UID: "e437f837-ac56-4b1a-b7ec-7a22cf98c8b3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.873324 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" (UID: "15d49c2d-56cf-46e9-b0e9-c5aac516fdf7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.873639 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" (UID: "7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.873942 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" (UID: "8ed43ce9-6f70-49b2-aa6e-50917ea9ca41"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.881445 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t" (OuterVolumeSpecName: "kube-api-access-g4g7t") pod "7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" (UID: "7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c"). InnerVolumeSpecName "kube-api-access-g4g7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.881856 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw" (OuterVolumeSpecName: "kube-api-access-qcwfw") pod "15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" (UID: "15d49c2d-56cf-46e9-b0e9-c5aac516fdf7"). InnerVolumeSpecName "kube-api-access-qcwfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.890645 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk" (OuterVolumeSpecName: "kube-api-access-dfljk") pod "8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" (UID: "8ed43ce9-6f70-49b2-aa6e-50917ea9ca41"). InnerVolumeSpecName "kube-api-access-dfljk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.891147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl" (OuterVolumeSpecName: "kube-api-access-2sqzl") pod "e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" (UID: "e437f837-ac56-4b1a-b7ec-7a22cf98c8b3"). InnerVolumeSpecName "kube-api-access-2sqzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.973155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plmtf\" (UniqueName: \"kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf\") pod \"92191eaa-0c0a-4927-adf4-a4e386ed2552\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.974164 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts\") pod \"92191eaa-0c0a-4927-adf4-a4e386ed2552\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976002 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc 
kubenswrapper[4885]: I0308 19:53:59.976027 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976039 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4g7t\" (UniqueName: \"kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976050 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976058 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sqzl\" (UniqueName: \"kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976067 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfljk\" (UniqueName: \"kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976075 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcwfw\" (UniqueName: \"kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976104 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.978187 4885 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92191eaa-0c0a-4927-adf4-a4e386ed2552" (UID: "92191eaa-0c0a-4927-adf4-a4e386ed2552"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.981019 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf" (OuterVolumeSpecName: "kube-api-access-plmtf") pod "92191eaa-0c0a-4927-adf4-a4e386ed2552" (UID: "92191eaa-0c0a-4927-adf4-a4e386ed2552"). InnerVolumeSpecName "kube-api-access-plmtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.009143 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.009142 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5fh2h" event={"ID":"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7","Type":"ContainerDied","Data":"4eb69137cae07743eb1af1f6321df214566d9dbcbf6527c970e6fc1975ea80cf"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.009345 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eb69137cae07743eb1af1f6321df214566d9dbcbf6527c970e6fc1975ea80cf" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.010908 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.010938 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" event={"ID":"11e75774-c86c-459a-9c66-eaf3c43addac","Type":"ContainerDied","Data":"8babcab44df031a7dca85bbd1e1b25d866660a24b8048961074ce85533c9f142"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.010977 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8babcab44df031a7dca85bbd1e1b25d866660a24b8048961074ce85533c9f142" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.013162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-b5mql" event={"ID":"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3","Type":"ContainerDied","Data":"06c6027e971cc55099bbd434c3d7f4bf09a5b78fbf9c1a23eea867d1aa75409f"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.013188 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06c6027e971cc55099bbd434c3d7f4bf09a5b78fbf9c1a23eea867d1aa75409f" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.013196 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-b5mql" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.017465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" event={"ID":"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41","Type":"ContainerDied","Data":"76f5bddc0468d531085855eaa891e527dd627f81f438aa98c14e0fe69b754f55"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.017492 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76f5bddc0468d531085855eaa891e527dd627f81f438aa98c14e0fe69b754f55" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.017568 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.029694 4885 generic.go:334] "Generic (PLEG): container finished" podID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerID="284122f7c1790bf6573097e7966743c78c5e2016a00bf0c85b35b46fb79ec3ac" exitCode=143 Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.029820 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerDied","Data":"284122f7c1790bf6573097e7966743c78c5e2016a00bf0c85b35b46fb79ec3ac"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.037455 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fdab-account-create-update-vs9sz" event={"ID":"92191eaa-0c0a-4927-adf4-a4e386ed2552","Type":"ContainerDied","Data":"3b62b092a692234fe276d2cdd5a40b55370dc15612fb51133501de3a4a2fb489"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.037511 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b62b092a692234fe276d2cdd5a40b55370dc15612fb51133501de3a4a2fb489" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.037578 4885 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.039555 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hjcwx" event={"ID":"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c","Type":"ContainerDied","Data":"c3c38b3a4b419b05af059ea5f575241ba3087897fe3f917341a136955ab80346"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.039579 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3c38b3a4b419b05af059ea5f575241ba3087897fe3f917341a136955ab80346" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.039623 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.077711 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.077785 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plmtf\" (UniqueName: \"kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135102 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549994-6zc6p"] Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135461 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-httpd" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135479 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-httpd" Mar 08 19:54:00 crc 
kubenswrapper[4885]: E0308 19:54:00.135489 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135496 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135508 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92191eaa-0c0a-4927-adf4-a4e386ed2552" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135514 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="92191eaa-0c0a-4927-adf4-a4e386ed2552" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135532 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135540 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135549 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135554 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135571 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e75774-c86c-459a-9c66-eaf3c43addac" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135577 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="11e75774-c86c-459a-9c66-eaf3c43addac" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135589 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135595 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135607 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-api" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135613 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-api" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135761 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135774 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="92191eaa-0c0a-4927-adf4-a4e386ed2552" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135785 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135797 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e75774-c86c-459a-9c66-eaf3c43addac" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135807 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc 
kubenswrapper[4885]: I0308 19:54:00.135817 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135827 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-api" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135839 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-httpd" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.136406 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.139790 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.139966 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.140590 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.147040 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549994-6zc6p"] Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.250437 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.251017 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-log" containerID="cri-o://5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144" 
gracePeriod=30 Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.251465 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-httpd" containerID="cri-o://bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2" gracePeriod=30 Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.281255 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng24f\" (UniqueName: \"kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f\") pod \"auto-csr-approver-29549994-6zc6p\" (UID: \"5e836afb-bb6f-4e67-9df6-5bef0273a523\") " pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.383029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng24f\" (UniqueName: \"kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f\") pod \"auto-csr-approver-29549994-6zc6p\" (UID: \"5e836afb-bb6f-4e67-9df6-5bef0273a523\") " pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.400805 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng24f\" (UniqueName: \"kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f\") pod \"auto-csr-approver-29549994-6zc6p\" (UID: \"5e836afb-bb6f-4e67-9df6-5bef0273a523\") " pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.463223 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.917523 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549994-6zc6p"] Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.064529 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerStarted","Data":"9348c59006c229d5addb1edf8ed7baf6b6d89cc79e7937bbe98f9278bc9d36c3"} Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.064783 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.067023 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerID="5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144" exitCode=143 Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.067068 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerDied","Data":"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144"} Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.068981 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" event={"ID":"5e836afb-bb6f-4e67-9df6-5bef0273a523","Type":"ContainerStarted","Data":"556ae4e045c8d953629c747935f3661f6a7601c17af1a6b947e6ef3e154c9e4b"} Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.091413 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.250901076 podStartE2EDuration="6.091396461s" podCreationTimestamp="2026-03-08 19:53:55 +0000 UTC" firstStartedPulling="2026-03-08 19:53:56.126449137 +0000 UTC m=+1337.522503160" 
lastFinishedPulling="2026-03-08 19:53:59.966944522 +0000 UTC m=+1341.362998545" observedRunningTime="2026-03-08 19:54:01.085135153 +0000 UTC m=+1342.481189186" watchObservedRunningTime="2026-03-08 19:54:01.091396461 +0000 UTC m=+1342.487450484" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.322401 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5pkh8"] Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.323498 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.326174 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.326308 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.326351 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qrg8t" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.340072 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5pkh8"] Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.403088 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.403427 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data\") pod 
\"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.403452 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.403504 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66qmj\" (UniqueName: \"kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.504850 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.504888 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.504936 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66qmj\" (UniqueName: 
\"kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.505012 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.513758 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.515791 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.527775 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66qmj\" (UniqueName: \"kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.528635 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.550667 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.550997 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-log" containerID="cri-o://bf43cbd05abc4859f6ceeadb70f7e22ee780318d3caa3829c75db5c4ff63615b" gracePeriod=30 Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.551079 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-httpd" containerID="cri-o://76f21ebe5e2a1ec7083c3221c48c316f2bf16958272fe800dd874d9d19aa99dd" gracePeriod=30 Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.678459 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.027700 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.079833 4885 generic.go:334] "Generic (PLEG): container finished" podID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerID="4b8fc7ef37c75cd3a7a88b7cc2d7710779d5554badc6896eed06c03d3625b81a" exitCode=0 Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.079879 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerDied","Data":"4b8fc7ef37c75cd3a7a88b7cc2d7710779d5554badc6896eed06c03d3625b81a"} Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.081648 4885 generic.go:334] "Generic (PLEG): container finished" podID="405b5d21-a208-4f86-b046-66968c326aa4" containerID="bf43cbd05abc4859f6ceeadb70f7e22ee780318d3caa3829c75db5c4ff63615b" exitCode=143 Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.082126 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerDied","Data":"bf43cbd05abc4859f6ceeadb70f7e22ee780318d3caa3829c75db5c4ff63615b"} Mar 08 19:54:02 crc kubenswrapper[4885]: W0308 19:54:02.158202 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84470b78_5e74_473c_88d3_5343943c01fb.slice/crio-ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25 WatchSource:0}: Error finding container ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25: Status 404 returned error can't find the container with id ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25 Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.158507 4885 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5pkh8"] Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.370357 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523048 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq6bs\" (UniqueName: \"kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523160 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523198 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523227 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523294 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" 
(UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") "
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523344 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") "
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523368 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") "
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.525831 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs" (OuterVolumeSpecName: "logs") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.534080 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts" (OuterVolumeSpecName: "scripts") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.541455 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs" (OuterVolumeSpecName: "kube-api-access-dq6bs") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "kube-api-access-dq6bs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.601077 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.621132 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data" (OuterVolumeSpecName: "config-data") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.627909 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.627970 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.627980 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.627989 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq6bs\" (UniqueName: \"kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.627997 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.649778 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.688077 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.729496 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.729534 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.818068 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.818119 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.818157 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.818801 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.818850 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83" gracePeriod=600
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.093727 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83" exitCode=0
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.093787 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83"}
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.093819 4885 scope.go:117] "RemoveContainer" containerID="e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374"
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.095947 4885 generic.go:334] "Generic (PLEG): container finished" podID="5e836afb-bb6f-4e67-9df6-5bef0273a523" containerID="0d00454c184e09bd4a156eebaa35bb3bcacf94bedd622a0c71e0954aef720385" exitCode=0
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.095998 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" event={"ID":"5e836afb-bb6f-4e67-9df6-5bef0273a523","Type":"ContainerDied","Data":"0d00454c184e09bd4a156eebaa35bb3bcacf94bedd622a0c71e0954aef720385"}
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.098556 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerDied","Data":"728e2c4227d2c2ab14fe2c194baeb854bdfe4cac966691c25a22f562f3e2f82b"}
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.098592 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b5685698-p87pb"
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.105711 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" event={"ID":"84470b78-5e74-473c-88d3-5343943c01fb","Type":"ContainerStarted","Data":"ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25"}
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.105845 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="proxy-httpd" containerID="cri-o://9348c59006c229d5addb1edf8ed7baf6b6d89cc79e7937bbe98f9278bc9d36c3" gracePeriod=30
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.105861 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-notification-agent" containerID="cri-o://7cf70af4753fbcc177c169967bfd0633e149f5d98df36cb2d6ff676d0a215e21" gracePeriod=30
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.105885 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="sg-core" containerID="cri-o://30e722bde831d03eed4bc7a2ac2c7c561a897a0bc5aee76137806f8c867c31e9" gracePeriod=30
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.105846 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-central-agent" containerID="cri-o://db7de2bfb2402bc7c35eeb0e3a0a80a212c00dd48e7b95c320538d854040bceb" gracePeriod=30
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.141629 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b5685698-p87pb"]
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.146648 4885 scope.go:117] "RemoveContainer" containerID="4b8fc7ef37c75cd3a7a88b7cc2d7710779d5554badc6896eed06c03d3625b81a"
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.147960 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6b5685698-p87pb"]
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.186717 4885 scope.go:117] "RemoveContainer" containerID="284122f7c1790bf6573097e7966743c78c5e2016a00bf0c85b35b46fb79ec3ac"
Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.380799 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" path="/var/lib/kubelet/pods/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1/volumes"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:03.999943 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.123489 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerID="bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2" exitCode=0
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.124056 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerDied","Data":"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2"}
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.124720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerDied","Data":"4a7d8fc5b878c3d0d7e4f116b765cb88438382af5862067e0b9a50b02bb40fea"}
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.124795 4885 scope.go:117] "RemoveContainer" containerID="bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.124164 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.128340 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8"}
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148726 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerID="9348c59006c229d5addb1edf8ed7baf6b6d89cc79e7937bbe98f9278bc9d36c3" exitCode=0
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148759 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerID="30e722bde831d03eed4bc7a2ac2c7c561a897a0bc5aee76137806f8c867c31e9" exitCode=2
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148768 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerID="7cf70af4753fbcc177c169967bfd0633e149f5d98df36cb2d6ff676d0a215e21" exitCode=0
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148774 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerDied","Data":"9348c59006c229d5addb1edf8ed7baf6b6d89cc79e7937bbe98f9278bc9d36c3"}
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148838 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerDied","Data":"30e722bde831d03eed4bc7a2ac2c7c561a897a0bc5aee76137806f8c867c31e9"}
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148853 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerDied","Data":"7cf70af4753fbcc177c169967bfd0633e149f5d98df36cb2d6ff676d0a215e21"}
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152270 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") "
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152487 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") "
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152513 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") "
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152534 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") "
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152565 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzlbm\" (UniqueName: \"kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") "
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152609 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") "
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152674 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") "
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152707 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") "
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.153945 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs" (OuterVolumeSpecName: "logs") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.155325 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.158851 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts" (OuterVolumeSpecName: "scripts") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.159247 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm" (OuterVolumeSpecName: "kube-api-access-hzlbm") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "kube-api-access-hzlbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.160391 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.198187 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.200832 4885 scope.go:117] "RemoveContainer" containerID="5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.212747 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data" (OuterVolumeSpecName: "config-data") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.256097 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.256203 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") "
Mar 08 19:54:04 crc kubenswrapper[4885]: W0308 19:54:04.256385 4885 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3c045d9d-f8a0-40b9-9600-0d10d5c699e7/volumes/kubernetes.io~secret/public-tls-certs
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.256415 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.257731 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzlbm\" (UniqueName: \"kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.257761 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.257771 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.258207 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.258229 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.258244 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.258265 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.258276 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.265158 4885 scope.go:117] "RemoveContainer" containerID="bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2"
Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.265735 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2\": container with ID starting with bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2 not found: ID does not exist" containerID="bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.265781 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2"} err="failed to get container status \"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2\": rpc error: code = NotFound desc = could not find container \"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2\": container with ID starting with bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2 not found: ID does not exist"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.265834 4885 scope.go:117] "RemoveContainer" containerID="5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144"
Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.266339 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144\": container with ID starting with 5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144 not found: ID does not exist" containerID="5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.266389 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144"} err="failed to get container status \"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144\": rpc error: code = NotFound desc = could not find container \"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144\": container with ID starting with 5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144 not found: ID does not exist"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.286031 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.361076 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.495255 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549994-6zc6p"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.520019 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.535537 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.565534 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.565929 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-log"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.565966 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-log"
Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.565986 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-httpd"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.565992 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-httpd"
Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.566007 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-log"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566013 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-log"
Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.566023 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-api"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566028 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-api"
Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.566039 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e836afb-bb6f-4e67-9df6-5bef0273a523" containerName="oc"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566046 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e836afb-bb6f-4e67-9df6-5bef0273a523" containerName="oc"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566231 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e836afb-bb6f-4e67-9df6-5bef0273a523" containerName="oc"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566243 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-httpd"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566259 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-log"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566268 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-log"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566281 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-api"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566739 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng24f\" (UniqueName: \"kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f\") pod \"5e836afb-bb6f-4e67-9df6-5bef0273a523\" (UID: \"5e836afb-bb6f-4e67-9df6-5bef0273a523\") "
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.567994 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.570047 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.570350 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.571377 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f" (OuterVolumeSpecName: "kube-api-access-ng24f") pod "5e836afb-bb6f-4e67-9df6-5bef0273a523" (UID: "5e836afb-bb6f-4e67-9df6-5bef0273a523"). InnerVolumeSpecName "kube-api-access-ng24f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.603051 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668060 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzmz5\" (UniqueName: \"kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668138 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668307 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668528 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668765 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668985 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng24f\" (UniqueName: \"kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770059 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzmz5\" (UniqueName: \"kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770099 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770123 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770181 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770204 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770247 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770272 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770309 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0"
Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770952 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\"
(UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.771486 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.774311 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.774736 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.775149 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.780399 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " 
pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.790504 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.793852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzmz5\" (UniqueName: \"kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.797125 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.934349 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.156785 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" event={"ID":"5e836afb-bb6f-4e67-9df6-5bef0273a523","Type":"ContainerDied","Data":"556ae4e045c8d953629c747935f3661f6a7601c17af1a6b947e6ef3e154c9e4b"} Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.157086 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="556ae4e045c8d953629c747935f3661f6a7601c17af1a6b947e6ef3e154c9e4b" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.157018 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.160939 4885 generic.go:334] "Generic (PLEG): container finished" podID="405b5d21-a208-4f86-b046-66968c326aa4" containerID="76f21ebe5e2a1ec7083c3221c48c316f2bf16958272fe800dd874d9d19aa99dd" exitCode=0 Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.160966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerDied","Data":"76f21ebe5e2a1ec7083c3221c48c316f2bf16958272fe800dd874d9d19aa99dd"} Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.164464 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285505 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285597 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285669 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285747 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285798 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285838 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkzrv\" (UniqueName: 
\"kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285853 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285910 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.288946 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs" (OuterVolumeSpecName: "logs") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.289123 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.292419 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts" (OuterVolumeSpecName: "scripts") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.292571 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv" (OuterVolumeSpecName: "kube-api-access-vkzrv") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "kube-api-access-vkzrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.294269 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.345090 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.351678 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data" (OuterVolumeSpecName: "config-data") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.375847 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.386340 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" path="/var/lib/kubelet/pods/3c045d9d-f8a0-40b9-9600-0d10d5c699e7/volumes" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389540 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkzrv\" (UniqueName: \"kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389567 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389578 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc 
kubenswrapper[4885]: I0308 19:54:05.389589 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389597 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389605 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389635 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389644 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.419514 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.483967 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.491438 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.580734 4885 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549988-f7hdz"] Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.589296 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549988-f7hdz"] Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.172966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerStarted","Data":"2d562399fb223e806d5f3ddff2425b5e427d18a16330c90ac41a561625d41719"} Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.173278 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerStarted","Data":"8290e58829785cfd7645e5b7ea06bfd203515f9adadd2b8e8b4383fbc9129293"} Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.175368 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerDied","Data":"0ea29bb519254f2f6d232c0a073b9e8199e006c424f0d72c1ebe3ec8f1381dff"} Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.175399 4885 scope.go:117] "RemoveContainer" containerID="76f21ebe5e2a1ec7083c3221c48c316f2bf16958272fe800dd874d9d19aa99dd" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.175488 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.208489 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.211420 4885 scope.go:117] "RemoveContainer" containerID="bf43cbd05abc4859f6ceeadb70f7e22ee780318d3caa3829c75db5c4ff63615b" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.226188 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.261332 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:06 crc kubenswrapper[4885]: E0308 19:54:06.262934 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-log" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.262950 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-log" Mar 08 19:54:06 crc kubenswrapper[4885]: E0308 19:54:06.262992 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-httpd" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.262999 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-httpd" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.267003 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-httpd" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.267040 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-log" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.269397 4885 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.272182 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.272279 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.292687 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408369 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408436 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408458 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408490 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408515 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408561 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408635 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tpp6\" (UniqueName: \"kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.509965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510274 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510343 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510360 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tpp6\" (UniqueName: \"kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510386 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510435 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510459 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510809 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.511442 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.511491 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.517176 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.517221 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.517947 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.526599 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.529433 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tpp6\" (UniqueName: \"kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " 
pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.546867 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.609138 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:07 crc kubenswrapper[4885]: I0308 19:54:07.137697 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:07 crc kubenswrapper[4885]: I0308 19:54:07.193854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerStarted","Data":"f5e5f2790360729d9c4394c0e85bb4e8ea8164ab35be9023623e79b3a117f852"} Mar 08 19:54:07 crc kubenswrapper[4885]: I0308 19:54:07.198220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerStarted","Data":"d206b01c706625f3b6d24a81cff0491b35107018670692e53d89b4cfafe0b053"} Mar 08 19:54:07 crc kubenswrapper[4885]: I0308 19:54:07.220795 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.220774262 podStartE2EDuration="3.220774262s" podCreationTimestamp="2026-03-08 19:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:07.215331737 +0000 UTC m=+1348.611385840" watchObservedRunningTime="2026-03-08 19:54:07.220774262 +0000 UTC m=+1348.616828295" Mar 08 19:54:07 crc kubenswrapper[4885]: 
I0308 19:54:07.380394 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="405b5d21-a208-4f86-b046-66968c326aa4" path="/var/lib/kubelet/pods/405b5d21-a208-4f86-b046-66968c326aa4/volumes" Mar 08 19:54:07 crc kubenswrapper[4885]: I0308 19:54:07.381346 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1daba97-3389-4e45-8a6c-bf910619f315" path="/var/lib/kubelet/pods/a1daba97-3389-4e45-8a6c-bf910619f315/volumes" Mar 08 19:54:08 crc kubenswrapper[4885]: I0308 19:54:08.222468 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerID="db7de2bfb2402bc7c35eeb0e3a0a80a212c00dd48e7b95c320538d854040bceb" exitCode=0 Mar 08 19:54:08 crc kubenswrapper[4885]: I0308 19:54:08.222561 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerDied","Data":"db7de2bfb2402bc7c35eeb0e3a0a80a212c00dd48e7b95c320538d854040bceb"} Mar 08 19:54:08 crc kubenswrapper[4885]: I0308 19:54:08.225774 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerStarted","Data":"4e4c7d9e4404bd1a5433f1787f2f7abf1d5d2e0fd51aebb6079e0aa7c48cd16e"} Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.269155 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerDied","Data":"34e707d299017c5d1c8910ad8b86e1920b5d7996a4afa2abaf7f5e2cb4124ea4"} Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.269690 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34e707d299017c5d1c8910ad8b86e1920b5d7996a4afa2abaf7f5e2cb4124ea4" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.309056 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426258 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426307 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426507 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426530 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426573 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426647 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426756 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72jnk\" (UniqueName: \"kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.428131 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.428707 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.429381 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.429416 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.432479 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk" (OuterVolumeSpecName: "kube-api-access-72jnk") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "kube-api-access-72jnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.440109 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts" (OuterVolumeSpecName: "scripts") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.454613 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.502004 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.527808 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data" (OuterVolumeSpecName: "config-data") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.531292 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72jnk\" (UniqueName: \"kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.531325 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.531337 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.531349 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts\") on node \"crc\" DevicePath 
\"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.531360 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.282887 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerStarted","Data":"41d8583c3d498141cc2e38d1ed8623082609a86faa3f087e124f2692ac0c8871"} Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.288675 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.294181 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" event={"ID":"84470b78-5e74-473c-88d3-5343943c01fb","Type":"ContainerStarted","Data":"5bc08b9d58402236103943567dfa278b8697894bc6f9fe1ef5bb281393c8f6d5"} Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.335662 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.335644207 podStartE2EDuration="7.335644207s" podCreationTimestamp="2026-03-08 19:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:13.324353384 +0000 UTC m=+1354.720407407" watchObservedRunningTime="2026-03-08 19:54:13.335644207 +0000 UTC m=+1354.731698220" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.361560 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.392697 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" 
podStartSLOduration=2.328384238 podStartE2EDuration="12.392682971s" podCreationTimestamp="2026-03-08 19:54:01 +0000 UTC" firstStartedPulling="2026-03-08 19:54:02.161944217 +0000 UTC m=+1343.557998230" lastFinishedPulling="2026-03-08 19:54:12.22624294 +0000 UTC m=+1353.622296963" observedRunningTime="2026-03-08 19:54:13.390786521 +0000 UTC m=+1354.786840554" watchObservedRunningTime="2026-03-08 19:54:13.392682971 +0000 UTC m=+1354.788736994" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.396416 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.414627 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:13 crc kubenswrapper[4885]: E0308 19:54:13.415134 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-notification-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415155 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-notification-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: E0308 19:54:13.415170 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-central-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415177 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-central-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: E0308 19:54:13.415208 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="proxy-httpd" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415216 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="proxy-httpd" Mar 08 19:54:13 crc 
kubenswrapper[4885]: E0308 19:54:13.415237 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="sg-core" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415244 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="sg-core" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415446 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="proxy-httpd" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415465 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-notification-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415483 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-central-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415496 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="sg-core" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.417481 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.420194 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.421517 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.456047 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.550867 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551432 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5pqv\" (UniqueName: \"kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551493 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551570 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " 
pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551659 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551705 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551732 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.652825 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.652884 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.652977 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.653015 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.653042 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.653091 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.653113 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5pqv\" (UniqueName: \"kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.653783 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: 
I0308 19:54:13.654181 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.659032 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.659262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.660249 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.665547 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.687878 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5pqv\" (UniqueName: \"kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv\") pod \"ceilometer-0\" (UID: 
\"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.757651 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.228757 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.297110 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerStarted","Data":"5306967bbbb924e179bb457b15fdbc54377a2dfcd6df23e85eb070929ec038ff"} Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.934850 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.935198 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.973437 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.981281 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 19:54:15 crc kubenswrapper[4885]: I0308 19:54:15.306396 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerStarted","Data":"1033d6de01346c935d13877d30ce6f5faeddb78e7018b0edc9f9cd7d22644bf2"} Mar 08 19:54:15 crc kubenswrapper[4885]: I0308 19:54:15.306637 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 19:54:15 crc kubenswrapper[4885]: I0308 19:54:15.306788 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 19:54:15 crc kubenswrapper[4885]: I0308 19:54:15.382173 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" path="/var/lib/kubelet/pods/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb/volumes" Mar 08 19:54:15 crc kubenswrapper[4885]: I0308 19:54:15.621145 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:16 crc kubenswrapper[4885]: I0308 19:54:16.316788 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerStarted","Data":"91ab3e0d4b5038bd39e0b50ea73003debc97ce3fd7cb74f3d1703406c18f0f45"} Mar 08 19:54:16 crc kubenswrapper[4885]: I0308 19:54:16.609391 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:16 crc kubenswrapper[4885]: I0308 19:54:16.609451 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:16 crc kubenswrapper[4885]: I0308 19:54:16.648518 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:16 crc kubenswrapper[4885]: I0308 19:54:16.652169 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:17 crc kubenswrapper[4885]: I0308 19:54:17.336056 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerStarted","Data":"e760af5e1e3c9c30fd53df761c3e06a42f4c5b2005a2c182abf8652d173afbb4"} Mar 08 19:54:17 crc kubenswrapper[4885]: I0308 19:54:17.336411 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 
19:54:17 crc kubenswrapper[4885]: I0308 19:54:17.336442 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:17 crc kubenswrapper[4885]: I0308 19:54:17.380979 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 19:54:17 crc kubenswrapper[4885]: I0308 19:54:17.381042 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.348563 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerStarted","Data":"682c312d6d039e2d50db0d6cb4756a641ecd8a4c893b0b2dd1dd818a51d8408e"} Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.348934 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-central-agent" containerID="cri-o://1033d6de01346c935d13877d30ce6f5faeddb78e7018b0edc9f9cd7d22644bf2" gracePeriod=30 Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.349259 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="proxy-httpd" containerID="cri-o://682c312d6d039e2d50db0d6cb4756a641ecd8a4c893b0b2dd1dd818a51d8408e" gracePeriod=30 Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.349293 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-notification-agent" containerID="cri-o://91ab3e0d4b5038bd39e0b50ea73003debc97ce3fd7cb74f3d1703406c18f0f45" gracePeriod=30 Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.349339 4885 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="sg-core" containerID="cri-o://e760af5e1e3c9c30fd53df761c3e06a42f4c5b2005a2c182abf8652d173afbb4" gracePeriod=30 Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.390010 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.816187066 podStartE2EDuration="5.389988271s" podCreationTimestamp="2026-03-08 19:54:13 +0000 UTC" firstStartedPulling="2026-03-08 19:54:14.238079188 +0000 UTC m=+1355.634133211" lastFinishedPulling="2026-03-08 19:54:17.811880393 +0000 UTC m=+1359.207934416" observedRunningTime="2026-03-08 19:54:18.376222224 +0000 UTC m=+1359.772276287" watchObservedRunningTime="2026-03-08 19:54:18.389988271 +0000 UTC m=+1359.786042304" Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.286993 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.293645 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362127 4885 generic.go:334] "Generic (PLEG): container finished" podID="54b37336-b51a-477c-90c6-78242b1e301a" containerID="682c312d6d039e2d50db0d6cb4756a641ecd8a4c893b0b2dd1dd818a51d8408e" exitCode=0 Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362167 4885 generic.go:334] "Generic (PLEG): container finished" podID="54b37336-b51a-477c-90c6-78242b1e301a" containerID="e760af5e1e3c9c30fd53df761c3e06a42f4c5b2005a2c182abf8652d173afbb4" exitCode=2 Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362176 4885 generic.go:334] "Generic (PLEG): container finished" podID="54b37336-b51a-477c-90c6-78242b1e301a" containerID="91ab3e0d4b5038bd39e0b50ea73003debc97ce3fd7cb74f3d1703406c18f0f45" exitCode=0 Mar 08 
19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362249 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerDied","Data":"682c312d6d039e2d50db0d6cb4756a641ecd8a4c893b0b2dd1dd818a51d8408e"} Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerDied","Data":"e760af5e1e3c9c30fd53df761c3e06a42f4c5b2005a2c182abf8652d173afbb4"} Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362327 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerDied","Data":"91ab3e0d4b5038bd39e0b50ea73003debc97ce3fd7cb74f3d1703406c18f0f45"} Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.391451 4885 generic.go:334] "Generic (PLEG): container finished" podID="54b37336-b51a-477c-90c6-78242b1e301a" containerID="1033d6de01346c935d13877d30ce6f5faeddb78e7018b0edc9f9cd7d22644bf2" exitCode=0 Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.391539 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerDied","Data":"1033d6de01346c935d13877d30ce6f5faeddb78e7018b0edc9f9cd7d22644bf2"} Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.694670 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.727346 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.727705 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.727880 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.728626 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.728697 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.728737 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.728758 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.728803 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5pqv\" (UniqueName: \"kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.729524 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.729575 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.733083 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv" (OuterVolumeSpecName: "kube-api-access-v5pqv") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "kube-api-access-v5pqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.733967 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts" (OuterVolumeSpecName: "scripts") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.772466 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.811091 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.830967 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.830999 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.831012 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.831026 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.831038 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5pqv\" (UniqueName: \"kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.833240 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data" (OuterVolumeSpecName: "config-data") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.933138 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.414062 4885 generic.go:334] "Generic (PLEG): container finished" podID="84470b78-5e74-473c-88d3-5343943c01fb" containerID="5bc08b9d58402236103943567dfa278b8697894bc6f9fe1ef5bb281393c8f6d5" exitCode=0 Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.414163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" event={"ID":"84470b78-5e74-473c-88d3-5343943c01fb","Type":"ContainerDied","Data":"5bc08b9d58402236103943567dfa278b8697894bc6f9fe1ef5bb281393c8f6d5"} Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.423010 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerDied","Data":"5306967bbbb924e179bb457b15fdbc54377a2dfcd6df23e85eb070929ec038ff"} Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.423084 4885 scope.go:117] "RemoveContainer" containerID="682c312d6d039e2d50db0d6cb4756a641ecd8a4c893b0b2dd1dd818a51d8408e" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.423280 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.462261 4885 scope.go:117] "RemoveContainer" containerID="e760af5e1e3c9c30fd53df761c3e06a42f4c5b2005a2c182abf8652d173afbb4" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.481687 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.518648 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.531478 4885 scope.go:117] "RemoveContainer" containerID="91ab3e0d4b5038bd39e0b50ea73003debc97ce3fd7cb74f3d1703406c18f0f45" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.552680 4885 scope.go:117] "RemoveContainer" containerID="1033d6de01346c935d13877d30ce6f5faeddb78e7018b0edc9f9cd7d22644bf2" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.554963 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:23 crc kubenswrapper[4885]: E0308 19:54:23.555489 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="sg-core" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555507 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="sg-core" Mar 08 19:54:23 crc kubenswrapper[4885]: E0308 19:54:23.555528 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-notification-agent" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555536 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-notification-agent" Mar 08 19:54:23 crc kubenswrapper[4885]: E0308 19:54:23.555561 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-central-agent" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555569 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-central-agent" Mar 08 19:54:23 crc kubenswrapper[4885]: E0308 19:54:23.555595 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="proxy-httpd" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555604 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="proxy-httpd" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555955 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="proxy-httpd" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555977 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-notification-agent" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555996 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="sg-core" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.556007 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-central-agent" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.558279 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.560770 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.565021 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.574614 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2s99\" (UniqueName: \"kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647480 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647523 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647563 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647636 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647661 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749421 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2s99\" (UniqueName: \"kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749595 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749696 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749779 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.751334 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " 
pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.751869 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.771611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.771623 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.772680 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.774522 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.804986 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2s99\" (UniqueName: 
\"kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.891786 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.367306 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.434505 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerStarted","Data":"031cf273d2ce6f416df1db118df4e04e2598121142c98562d1d0691ec1ae6950"} Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.815504 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.878426 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle\") pod \"84470b78-5e74-473c-88d3-5343943c01fb\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.878710 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66qmj\" (UniqueName: \"kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj\") pod \"84470b78-5e74-473c-88d3-5343943c01fb\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.878752 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data\") pod 
\"84470b78-5e74-473c-88d3-5343943c01fb\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.878794 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts\") pod \"84470b78-5e74-473c-88d3-5343943c01fb\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.886284 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj" (OuterVolumeSpecName: "kube-api-access-66qmj") pod "84470b78-5e74-473c-88d3-5343943c01fb" (UID: "84470b78-5e74-473c-88d3-5343943c01fb"). InnerVolumeSpecName "kube-api-access-66qmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.887260 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts" (OuterVolumeSpecName: "scripts") pod "84470b78-5e74-473c-88d3-5343943c01fb" (UID: "84470b78-5e74-473c-88d3-5343943c01fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.912308 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84470b78-5e74-473c-88d3-5343943c01fb" (UID: "84470b78-5e74-473c-88d3-5343943c01fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.912573 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data" (OuterVolumeSpecName: "config-data") pod "84470b78-5e74-473c-88d3-5343943c01fb" (UID: "84470b78-5e74-473c-88d3-5343943c01fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.980960 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.981350 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.981376 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66qmj\" (UniqueName: \"kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.981397 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.379799 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b37336-b51a-477c-90c6-78242b1e301a" path="/var/lib/kubelet/pods/54b37336-b51a-477c-90c6-78242b1e301a/volumes" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.463059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" 
event={"ID":"84470b78-5e74-473c-88d3-5343943c01fb","Type":"ContainerDied","Data":"ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25"} Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.463132 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.463207 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.465836 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerStarted","Data":"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa"} Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.517337 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 19:54:25 crc kubenswrapper[4885]: E0308 19:54:25.517755 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84470b78-5e74-473c-88d3-5343943c01fb" containerName="nova-cell0-conductor-db-sync" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.517773 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="84470b78-5e74-473c-88d3-5343943c01fb" containerName="nova-cell0-conductor-db-sync" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.518004 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="84470b78-5e74-473c-88d3-5343943c01fb" containerName="nova-cell0-conductor-db-sync" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.522101 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.524839 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.524928 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qrg8t" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.543907 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.592115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.592162 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwjwb\" (UniqueName: \"kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.592265 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.693854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.694330 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.694373 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwjwb\" (UniqueName: \"kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.698671 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.700416 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.722414 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwjwb\" (UniqueName: \"kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb\") pod \"nova-cell0-conductor-0\" (UID: 
\"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.840608 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:26 crc kubenswrapper[4885]: I0308 19:54:26.275430 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 19:54:26 crc kubenswrapper[4885]: W0308 19:54:26.278773 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedd9ad85_0e13_4d1f_ab0e_ffd5630c6197.slice/crio-e45bf247b9d279f13a7ad3c2bb090db1b20d8e474638af4ca7545e5e6c5bd1a9 WatchSource:0}: Error finding container e45bf247b9d279f13a7ad3c2bb090db1b20d8e474638af4ca7545e5e6c5bd1a9: Status 404 returned error can't find the container with id e45bf247b9d279f13a7ad3c2bb090db1b20d8e474638af4ca7545e5e6c5bd1a9 Mar 08 19:54:26 crc kubenswrapper[4885]: I0308 19:54:26.479012 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerStarted","Data":"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1"} Mar 08 19:54:26 crc kubenswrapper[4885]: I0308 19:54:26.479281 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerStarted","Data":"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055"} Mar 08 19:54:26 crc kubenswrapper[4885]: I0308 19:54:26.485493 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197","Type":"ContainerStarted","Data":"e45bf247b9d279f13a7ad3c2bb090db1b20d8e474638af4ca7545e5e6c5bd1a9"} Mar 08 19:54:27 crc kubenswrapper[4885]: I0308 19:54:27.500676 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-0" event={"ID":"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197","Type":"ContainerStarted","Data":"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b"} Mar 08 19:54:27 crc kubenswrapper[4885]: I0308 19:54:27.500987 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:27 crc kubenswrapper[4885]: I0308 19:54:27.534588 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.5345607709999998 podStartE2EDuration="2.534560771s" podCreationTimestamp="2026-03-08 19:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:27.52480205 +0000 UTC m=+1368.920856113" watchObservedRunningTime="2026-03-08 19:54:27.534560771 +0000 UTC m=+1368.930614834" Mar 08 19:54:28 crc kubenswrapper[4885]: I0308 19:54:28.513885 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerStarted","Data":"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853"} Mar 08 19:54:28 crc kubenswrapper[4885]: I0308 19:54:28.514409 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 19:54:28 crc kubenswrapper[4885]: I0308 19:54:28.543704 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.231132469 podStartE2EDuration="5.543682967s" podCreationTimestamp="2026-03-08 19:54:23 +0000 UTC" firstStartedPulling="2026-03-08 19:54:24.378398891 +0000 UTC m=+1365.774452914" lastFinishedPulling="2026-03-08 19:54:27.690949379 +0000 UTC m=+1369.087003412" observedRunningTime="2026-03-08 19:54:28.533980257 +0000 UTC m=+1369.930034290" watchObservedRunningTime="2026-03-08 19:54:28.543682967 +0000 UTC 
m=+1369.939737000" Mar 08 19:54:35 crc kubenswrapper[4885]: I0308 19:54:35.880006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.471577 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pmgdm"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.473565 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.475600 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.476815 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.505063 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmgdm"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.619597 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.620776 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.624729 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.636707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.637198 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjvvt\" (UniqueName: \"kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.637354 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.637450 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.638831 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.730619 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.732189 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.735838 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739159 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739208 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvzmh\" (UniqueName: \"kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739248 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739291 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739316 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjvvt\" (UniqueName: \"kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739349 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.747794 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.754082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" 
Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.755235 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.756138 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.756662 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.763324 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.774450 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.775531 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjvvt\" (UniqueName: \"kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.792101 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.824064 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.844823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845172 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845321 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845444 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvzmh\" (UniqueName: \"kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845562 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845650 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845844 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgvhx\" (UniqueName: \"kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845972 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.846127 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9qpj\" (UniqueName: \"kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.846218 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.852028 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.853628 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.865020 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.866413 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.870134 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.870953 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.883028 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.888067 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.888133 4885 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.918693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvzmh\" (UniqueName: \"kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.939634 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948094 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9qpj\" (UniqueName: \"kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948642 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948736 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948786 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948811 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948842 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948875 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948897 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: 
\"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948950 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcznq\" (UniqueName: \"kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948993 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949058 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " 
pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949097 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frh8b\" (UniqueName: \"kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgvhx\" (UniqueName: \"kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.950946 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.952878 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.958436 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.959474 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.964437 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.969292 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.972050 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9qpj\" (UniqueName: \"kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.982967 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgvhx\" (UniqueName: \"kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " 
pod="openstack/nova-metadata-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.018414 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.059700 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.059812 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.059868 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcznq\" (UniqueName: \"kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.060983 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.061042 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.061129 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.061175 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frh8b\" (UniqueName: \"kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.061352 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.061417 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.062474 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: 
\"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.065657 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.066230 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.066272 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.066962 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.068539 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: 
I0308 19:54:37.070724 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.085769 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcznq\" (UniqueName: \"kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.089210 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frh8b\" (UniqueName: \"kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.112766 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.153361 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.389666 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.670661 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.678260 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmgdm"] Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.891420 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:37 crc kubenswrapper[4885]: W0308 19:54:37.901335 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod515ba29a_53ae_41dc_a444_9ffe060dc61f.slice/crio-93b0b9013aeb93d202bb43a6cf260d8de9e26ddf4b972302a13aaf4bbc7119c7 WatchSource:0}: Error finding container 93b0b9013aeb93d202bb43a6cf260d8de9e26ddf4b972302a13aaf4bbc7119c7: Status 404 returned error can't find the container with id 93b0b9013aeb93d202bb43a6cf260d8de9e26ddf4b972302a13aaf4bbc7119c7 Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.901450 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.998359 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jc2k"] Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.999565 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.001648 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.006093 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.033999 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jc2k"] Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.057184 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.065230 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:38 crc kubenswrapper[4885]: W0308 19:54:38.070298 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f43bb9c_a447_4ef0_8cdd_4447d8703193.slice/crio-8e7cdd4e54b7967bac306515e8c7e9771755d86ab276c44dacbac33677726424 WatchSource:0}: Error finding container 8e7cdd4e54b7967bac306515e8c7e9771755d86ab276c44dacbac33677726424: Status 404 returned error can't find the container with id 8e7cdd4e54b7967bac306515e8c7e9771755d86ab276c44dacbac33677726424 Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.095866 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.095997 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-pc57k\" (UniqueName: \"kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.096091 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.096119 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.197366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.197455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc57k\" (UniqueName: \"kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.197525 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.197550 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.201461 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.201510 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.201868 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.214476 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pc57k\" (UniqueName: \"kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.318462 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.647680 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmgdm" event={"ID":"72fa5124-24e9-47b1-8522-815cfef2a86b","Type":"ContainerStarted","Data":"6794573adf705b25644869fa29cf7b121bfba4201d8cc4c2b8d4234d9e883a1d"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.648044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmgdm" event={"ID":"72fa5124-24e9-47b1-8522-815cfef2a86b","Type":"ContainerStarted","Data":"251a9a13da15a33f312746f93ec7e90b7cec5cac915228321a2a19b174040f9e"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.661594 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"515ba29a-53ae-41dc-a444-9ffe060dc61f","Type":"ContainerStarted","Data":"93b0b9013aeb93d202bb43a6cf260d8de9e26ddf4b972302a13aaf4bbc7119c7"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.663395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerStarted","Data":"8e7cdd4e54b7967bac306515e8c7e9771755d86ab276c44dacbac33677726424"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.670837 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerStarted","Data":"44a4da7dfb75724690fd74611aac2ae815bfdeff07d9af1d06d6b4bf6e5274e1"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.672147 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pmgdm" podStartSLOduration=2.672128622 podStartE2EDuration="2.672128622s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:38.666278726 +0000 UTC m=+1380.062332759" watchObservedRunningTime="2026-03-08 19:54:38.672128622 +0000 UTC m=+1380.068182645" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.672657 4885 generic.go:334] "Generic (PLEG): container finished" podID="0274624f-a49d-425f-b025-753e4e174477" containerID="bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7" exitCode=0 Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.672824 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" event={"ID":"0274624f-a49d-425f-b025-753e4e174477","Type":"ContainerDied","Data":"bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.672846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" event={"ID":"0274624f-a49d-425f-b025-753e4e174477","Type":"ContainerStarted","Data":"e0f14b4c223b387a85388f249c4bcd0b49cb89c791c76c20d6484eb655e195a3"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.682169 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"699e9586-4fcb-4a93-b479-44d269162645","Type":"ContainerStarted","Data":"28d9f2d30308ef7519a870cfda7992bd0f04e5904fdb1fac5d4c1b4a105e7de5"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.836422 4885 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jc2k"] Mar 08 19:54:38 crc kubenswrapper[4885]: W0308 19:54:38.858235 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ea65f9e_cbf1_47a6_8800_aa6b7fe9ffef.slice/crio-3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa WatchSource:0}: Error finding container 3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa: Status 404 returned error can't find the container with id 3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.694811 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" event={"ID":"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef","Type":"ContainerStarted","Data":"72f8ee44be245d4136285cfdfda421e5c74196d06b96d20eec24f989618614f0"} Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.695210 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" event={"ID":"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef","Type":"ContainerStarted","Data":"3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa"} Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.701366 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" event={"ID":"0274624f-a49d-425f-b025-753e4e174477","Type":"ContainerStarted","Data":"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db"} Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.701436 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.737091 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" podStartSLOduration=2.737071124 
podStartE2EDuration="2.737071124s" podCreationTimestamp="2026-03-08 19:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:39.726269315 +0000 UTC m=+1381.122323338" watchObservedRunningTime="2026-03-08 19:54:39.737071124 +0000 UTC m=+1381.133125157" Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.753400 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" podStartSLOduration=3.753345729 podStartE2EDuration="3.753345729s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:39.750310908 +0000 UTC m=+1381.146364951" watchObservedRunningTime="2026-03-08 19:54:39.753345729 +0000 UTC m=+1381.149399752" Mar 08 19:54:40 crc kubenswrapper[4885]: I0308 19:54:40.785443 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:40 crc kubenswrapper[4885]: I0308 19:54:40.810245 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.766613 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerStarted","Data":"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.767534 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerStarted","Data":"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.768468 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"699e9586-4fcb-4a93-b479-44d269162645","Type":"ContainerStarted","Data":"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.771772 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"515ba29a-53ae-41dc-a444-9ffe060dc61f","Type":"ContainerStarted","Data":"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.771898 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="515ba29a-53ae-41dc-a444-9ffe060dc61f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e" gracePeriod=30 Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.780489 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerStarted","Data":"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.780533 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerStarted","Data":"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.780669 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-log" containerID="cri-o://4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556" gracePeriod=30 Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.780899 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-metadata" containerID="cri-o://5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395" gracePeriod=30 Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.799692 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.092796799 podStartE2EDuration="6.799641177s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="2026-03-08 19:54:37.898909293 +0000 UTC m=+1379.294963316" lastFinishedPulling="2026-03-08 19:54:41.605753671 +0000 UTC m=+1383.001807694" observedRunningTime="2026-03-08 19:54:42.793429281 +0000 UTC m=+1384.189483314" watchObservedRunningTime="2026-03-08 19:54:42.799641177 +0000 UTC m=+1384.195695210" Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.816729 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.315089473 podStartE2EDuration="6.816705464s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="2026-03-08 19:54:38.076196162 +0000 UTC m=+1379.472250185" lastFinishedPulling="2026-03-08 19:54:41.577812163 +0000 UTC m=+1382.973866176" observedRunningTime="2026-03-08 19:54:42.815408369 +0000 UTC m=+1384.211462432" watchObservedRunningTime="2026-03-08 19:54:42.816705464 +0000 UTC m=+1384.212759527" Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.863017 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.190156985 podStartE2EDuration="6.862996973s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="2026-03-08 19:54:37.904586235 +0000 UTC m=+1379.300640258" lastFinishedPulling="2026-03-08 19:54:41.577426223 +0000 UTC m=+1382.973480246" observedRunningTime="2026-03-08 19:54:42.855102963 +0000 UTC m=+1384.251156996" watchObservedRunningTime="2026-03-08 19:54:42.862996973 
+0000 UTC m=+1384.259051006" Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.867885 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9671046629999998 podStartE2EDuration="6.867874525s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="2026-03-08 19:54:37.676433645 +0000 UTC m=+1379.072487668" lastFinishedPulling="2026-03-08 19:54:41.577203497 +0000 UTC m=+1382.973257530" observedRunningTime="2026-03-08 19:54:42.838493358 +0000 UTC m=+1384.234547391" watchObservedRunningTime="2026-03-08 19:54:42.867874525 +0000 UTC m=+1384.263928548" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.356307 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.528162 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgvhx\" (UniqueName: \"kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx\") pod \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.528320 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs\") pod \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.528420 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle\") pod \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.528442 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data\") pod \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.528691 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs" (OuterVolumeSpecName: "logs") pod "1f43bb9c-a447-4ef0-8cdd-4447d8703193" (UID: "1f43bb9c-a447-4ef0-8cdd-4447d8703193"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.534349 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx" (OuterVolumeSpecName: "kube-api-access-xgvhx") pod "1f43bb9c-a447-4ef0-8cdd-4447d8703193" (UID: "1f43bb9c-a447-4ef0-8cdd-4447d8703193"). InnerVolumeSpecName "kube-api-access-xgvhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.569151 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data" (OuterVolumeSpecName: "config-data") pod "1f43bb9c-a447-4ef0-8cdd-4447d8703193" (UID: "1f43bb9c-a447-4ef0-8cdd-4447d8703193"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.575624 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f43bb9c-a447-4ef0-8cdd-4447d8703193" (UID: "1f43bb9c-a447-4ef0-8cdd-4447d8703193"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.630945 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgvhx\" (UniqueName: \"kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.630983 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.630998 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.631006 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790561 4885 generic.go:334] "Generic (PLEG): container finished" podID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerID="5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395" exitCode=0 Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790599 4885 generic.go:334] "Generic (PLEG): container finished" podID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerID="4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556" exitCode=143 Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790619 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790668 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerDied","Data":"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395"} Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790724 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerDied","Data":"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556"} Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790743 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerDied","Data":"8e7cdd4e54b7967bac306515e8c7e9771755d86ab276c44dacbac33677726424"} Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790762 4885 scope.go:117] "RemoveContainer" containerID="5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.828989 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.845554 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.858886 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:43 crc kubenswrapper[4885]: E0308 19:54:43.859299 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-metadata" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.859318 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-metadata" Mar 
08 19:54:43 crc kubenswrapper[4885]: E0308 19:54:43.859332 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-log" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.859338 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-log" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.859505 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-metadata" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.859525 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-log" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.860527 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.860723 4885 scope.go:117] "RemoveContainer" containerID="4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.863189 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.868058 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.872712 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.916622 4885 scope.go:117] "RemoveContainer" containerID="5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395" Mar 08 19:54:43 crc kubenswrapper[4885]: E0308 19:54:43.917432 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395\": container with ID starting with 5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395 not found: ID does not exist" containerID="5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.917468 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395"} err="failed to get container status \"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395\": rpc error: code = NotFound desc = could not find container \"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395\": container with ID starting with 5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395 not found: ID does not exist" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.917525 4885 scope.go:117] "RemoveContainer" containerID="4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556" Mar 08 19:54:43 crc kubenswrapper[4885]: E0308 19:54:43.918288 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556\": container with ID starting with 4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556 not found: ID does not exist" containerID="4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.918340 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556"} err="failed to get container status \"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556\": rpc error: code = NotFound desc = could not find container 
\"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556\": container with ID starting with 4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556 not found: ID does not exist"
Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.918408    4885 scope.go:117] "RemoveContainer" containerID="5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395"
Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.919026    4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395"} err="failed to get container status \"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395\": rpc error: code = NotFound desc = could not find container \"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395\": container with ID starting with 5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395 not found: ID does not exist"
Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.919077    4885 scope.go:117] "RemoveContainer" containerID="4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556"
Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.920298    4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556"} err="failed to get container status \"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556\": rpc error: code = NotFound desc = could not find container \"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556\": container with ID starting with 4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556 not found: ID does not exist"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.039087    4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mthg\" (UniqueName: \"kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.039133    4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.039166    4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.039210    4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.039538    4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.142030    4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mthg\" (UniqueName: \"kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.142101    4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.142157    4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.142233    4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.142311    4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.142824    4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.145967    4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.147611    4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.159303    4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.166805    4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mthg\" (UniqueName: \"kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.202848    4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.733884    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.807655    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerStarted","Data":"44b459aaa1e4b5454e9eaf60660b8259d3e8c050ced5809387d49579fc3941c7"}
Mar 08 19:54:45 crc kubenswrapper[4885]: E0308 19:54:45.248074    4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72fa5124_24e9_47b1_8522_815cfef2a86b.slice/crio-conmon-6794573adf705b25644869fa29cf7b121bfba4201d8cc4c2b8d4234d9e883a1d.scope\": RecentStats: unable to find data in memory cache]"
Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.381174    4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" path="/var/lib/kubelet/pods/1f43bb9c-a447-4ef0-8cdd-4447d8703193/volumes"
Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.824309    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerStarted","Data":"524c8a726c09b6bf03417223c87a06d2d37402e32a2b0f4addd79d17ce8ac52c"}
Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.824463    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerStarted","Data":"0ea0d064913557c827d5cc82c34118e0a93078364e86541377cb2d10a87cbb38"}
Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.827115    4885 generic.go:334] "Generic (PLEG): container finished" podID="72fa5124-24e9-47b1-8522-815cfef2a86b" containerID="6794573adf705b25644869fa29cf7b121bfba4201d8cc4c2b8d4234d9e883a1d" exitCode=0
Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.827232    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmgdm" event={"ID":"72fa5124-24e9-47b1-8522-815cfef2a86b","Type":"ContainerDied","Data":"6794573adf705b25644869fa29cf7b121bfba4201d8cc4c2b8d4234d9e883a1d"}
Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.858864    4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.85883683 podStartE2EDuration="2.85883683s" podCreationTimestamp="2026-03-08 19:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:45.840976631 +0000 UTC m=+1387.237030654" watchObservedRunningTime="2026-03-08 19:54:45.85883683 +0000 UTC m=+1387.254890893"
Mar 08 19:54:46 crc kubenswrapper[4885]: I0308 19:54:46.844843    4885 generic.go:334] "Generic (PLEG): container finished" podID="1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" containerID="72f8ee44be245d4136285cfdfda421e5c74196d06b96d20eec24f989618614f0" exitCode=0
Mar 08 19:54:46 crc kubenswrapper[4885]: I0308 19:54:46.844961    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" event={"ID":"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef","Type":"ContainerDied","Data":"72f8ee44be245d4136285cfdfda421e5c74196d06b96d20eec24f989618614f0"}
Mar 08 19:54:46 crc kubenswrapper[4885]: I0308 19:54:46.941212    4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 08 19:54:46 crc kubenswrapper[4885]: I0308 19:54:46.941292    4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 08 19:54:46 crc kubenswrapper[4885]: I0308 19:54:46.990796    4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.023138    4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.023200    4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.114228    4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.327392    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmgdm"
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.392177    4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm"
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.447319    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"]
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.449878    4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="dnsmasq-dns" containerID="cri-o://88950a0ff1c85394753db02c54eed22b3bf6bb84a7de33f37b2a9f7d03adb063" gracePeriod=10
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.516604    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle\") pod \"72fa5124-24e9-47b1-8522-815cfef2a86b\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") "
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.516871    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data\") pod \"72fa5124-24e9-47b1-8522-815cfef2a86b\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") "
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.516895    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts\") pod \"72fa5124-24e9-47b1-8522-815cfef2a86b\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") "
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.516994    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjvvt\" (UniqueName: \"kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt\") pod \"72fa5124-24e9-47b1-8522-815cfef2a86b\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") "
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.523161    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts" (OuterVolumeSpecName: "scripts") pod "72fa5124-24e9-47b1-8522-815cfef2a86b" (UID: "72fa5124-24e9-47b1-8522-815cfef2a86b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.524013    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt" (OuterVolumeSpecName: "kube-api-access-jjvvt") pod "72fa5124-24e9-47b1-8522-815cfef2a86b" (UID: "72fa5124-24e9-47b1-8522-815cfef2a86b"). InnerVolumeSpecName "kube-api-access-jjvvt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.544516    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72fa5124-24e9-47b1-8522-815cfef2a86b" (UID: "72fa5124-24e9-47b1-8522-815cfef2a86b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.557822    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data" (OuterVolumeSpecName: "config-data") pod "72fa5124-24e9-47b1-8522-815cfef2a86b" (UID: "72fa5124-24e9-47b1-8522-815cfef2a86b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.622842    4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjvvt\" (UniqueName: \"kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.622945    4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.622956    4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.622964    4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.663624    4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.165:5353: connect: connection refused"
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.900284    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmgdm" event={"ID":"72fa5124-24e9-47b1-8522-815cfef2a86b","Type":"ContainerDied","Data":"251a9a13da15a33f312746f93ec7e90b7cec5cac915228321a2a19b174040f9e"}
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.900330    4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="251a9a13da15a33f312746f93ec7e90b7cec5cac915228321a2a19b174040f9e"
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.900424    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmgdm"
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.912186    4885 generic.go:334] "Generic (PLEG): container finished" podID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerID="88950a0ff1c85394753db02c54eed22b3bf6bb84a7de33f37b2a9f7d03adb063" exitCode=0
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.912441    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" event={"ID":"d087e374-bcc9-4a44-8fbe-aee43a47115e","Type":"ContainerDied","Data":"88950a0ff1c85394753db02c54eed22b3bf6bb84a7de33f37b2a9f7d03adb063"}
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.941989    4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.980572    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.029985    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r6vd\" (UniqueName: \"kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") "
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.030086    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") "
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.030115    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") "
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.030176    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") "
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.030211    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") "
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.030244    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") "
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.074248    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd" (OuterVolumeSpecName: "kube-api-access-9r6vd") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "kube-api-access-9r6vd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.110392    4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.110400    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.110864    4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.110904    4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-log" containerID="cri-o://1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff" gracePeriod=30
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.111038    4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-api" containerID="cri-o://52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d" gracePeriod=30
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.134106    4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r6vd\" (UniqueName: \"kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.146580    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config" (OuterVolumeSpecName: "config") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.176275    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.176499    4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-log" containerID="cri-o://0ea0d064913557c827d5cc82c34118e0a93078364e86541377cb2d10a87cbb38" gracePeriod=30
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.177000    4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-metadata" containerID="cri-o://524c8a726c09b6bf03417223c87a06d2d37402e32a2b0f4addd79d17ce8ac52c" gracePeriod=30
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.235184    4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.237280    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.237404    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.268974    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.299452    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.341098    4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.341139    4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.341154    4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.341167    4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.427660    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jc2k"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.441469    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data\") pod \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") "
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.441531    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc57k\" (UniqueName: \"kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k\") pod \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") "
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.441601    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle\") pod \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") "
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.441668    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts\") pod \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") "
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.448154    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k" (OuterVolumeSpecName: "kube-api-access-pc57k") pod "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" (UID: "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef"). InnerVolumeSpecName "kube-api-access-pc57k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.452050    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts" (OuterVolumeSpecName: "scripts") pod "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" (UID: "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.479562    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" (UID: "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.490417    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data" (OuterVolumeSpecName: "config-data") pod "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" (UID: "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.543439    4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.543476    4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.543487    4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc57k\" (UniqueName: \"kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.543497    4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.583528    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.951035    4885 generic.go:334] "Generic (PLEG): container finished" podID="5f00be59-96a4-4f2f-b319-6435aa008932" containerID="1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff" exitCode=143
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.951301    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerDied","Data":"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff"}
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954104    4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 08 19:54:48 crc kubenswrapper[4885]: E0308 19:54:48.954527    4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="dnsmasq-dns"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954544    4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="dnsmasq-dns"
Mar 08 19:54:48 crc kubenswrapper[4885]: E0308 19:54:48.954580    4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72fa5124-24e9-47b1-8522-815cfef2a86b" containerName="nova-manage"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954587    4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="72fa5124-24e9-47b1-8522-815cfef2a86b" containerName="nova-manage"
Mar 08 19:54:48 crc kubenswrapper[4885]: E0308 19:54:48.954599    4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="init"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954605    4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="init"
Mar 08 19:54:48 crc kubenswrapper[4885]: E0308 19:54:48.954615    4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" containerName="nova-cell1-conductor-db-sync"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954622    4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" containerName="nova-cell1-conductor-db-sync"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954790    4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="72fa5124-24e9-47b1-8522-815cfef2a86b" containerName="nova-manage"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954817    4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" containerName="nova-cell1-conductor-db-sync"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954827    4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="dnsmasq-dns"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.955445    4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.962388    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" event={"ID":"d087e374-bcc9-4a44-8fbe-aee43a47115e","Type":"ContainerDied","Data":"4547972efa3892226729dccf70e00d854a9c1e79c44132cd7c28be08c974628a"}
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.962432    4885 scope.go:117] "RemoveContainer" containerID="88950a0ff1c85394753db02c54eed22b3bf6bb84a7de33f37b2a9f7d03adb063"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.962561    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.976816    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.982285    4885 generic.go:334] "Generic (PLEG): container finished" podID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerID="524c8a726c09b6bf03417223c87a06d2d37402e32a2b0f4addd79d17ce8ac52c" exitCode=0
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.982316    4885 generic.go:334] "Generic (PLEG): container finished" podID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerID="0ea0d064913557c827d5cc82c34118e0a93078364e86541377cb2d10a87cbb38" exitCode=143
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.982356    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerDied","Data":"524c8a726c09b6bf03417223c87a06d2d37402e32a2b0f4addd79d17ce8ac52c"}
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.982381    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerDied","Data":"0ea0d064913557c827d5cc82c34118e0a93078364e86541377cb2d10a87cbb38"}
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.987593    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jc2k"
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.994650    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" event={"ID":"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef","Type":"ContainerDied","Data":"3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa"}
Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.994677    4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa"
Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.013883    4885 scope.go:117] "RemoveContainer" containerID="41bd07d83b5b4958e58a7473f1f938d73689ec0cd631180b50c3f160c3251d1c"
Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.031966    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"]
Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.039366    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.039551    4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"]
Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.053568    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mthg\" (UniqueName: \"kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg\") pod \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") "
Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.053817    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data\") pod \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") "
Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.053915    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs\") pod \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") "
Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.054083    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle\") pod \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") "
Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.054164    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs\") pod \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") "
Mar
08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.054542 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.054628 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs4qz\" (UniqueName: \"kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.054723 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.058414 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs" (OuterVolumeSpecName: "logs") pod "7db4ab0b-c8b4-4809-8ff5-31f5175de78d" (UID: "7db4ab0b-c8b4-4809-8ff5-31f5175de78d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.059118 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg" (OuterVolumeSpecName: "kube-api-access-2mthg") pod "7db4ab0b-c8b4-4809-8ff5-31f5175de78d" (UID: "7db4ab0b-c8b4-4809-8ff5-31f5175de78d"). 
InnerVolumeSpecName "kube-api-access-2mthg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.088763 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7db4ab0b-c8b4-4809-8ff5-31f5175de78d" (UID: "7db4ab0b-c8b4-4809-8ff5-31f5175de78d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.090412 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data" (OuterVolumeSpecName: "config-data") pod "7db4ab0b-c8b4-4809-8ff5-31f5175de78d" (UID: "7db4ab0b-c8b4-4809-8ff5-31f5175de78d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.114936 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7db4ab0b-c8b4-4809-8ff5-31f5175de78d" (UID: "7db4ab0b-c8b4-4809-8ff5-31f5175de78d"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156116 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156344 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs4qz\" (UniqueName: \"kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156473 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156646 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156723 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156791 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mthg\" (UniqueName: \"kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg\") on node \"crc\" DevicePath \"\"" 
Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156857 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156914 4885 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.159298 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.159746 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.170170 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs4qz\" (UniqueName: \"kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.286061 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.386847 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" path="/var/lib/kubelet/pods/d087e374-bcc9-4a44-8fbe-aee43a47115e/volumes" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.802389 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.998209 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bbdf164-51e7-4faf-986b-fba5044fad2b","Type":"ContainerStarted","Data":"c280fe9fe13ec4f9da8c09591afb05298340012fc23ed25819dbc3970dce1bc0"} Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.000601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerDied","Data":"44b459aaa1e4b5454e9eaf60660b8259d3e8c050ced5809387d49579fc3941c7"} Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.000610 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.000857 4885 scope.go:117] "RemoveContainer" containerID="524c8a726c09b6bf03417223c87a06d2d37402e32a2b0f4addd79d17ce8ac52c" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.001257 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="699e9586-4fcb-4a93-b479-44d269162645" containerName="nova-scheduler-scheduler" containerID="cri-o://eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" gracePeriod=30 Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.038959 4885 scope.go:117] "RemoveContainer" containerID="0ea0d064913557c827d5cc82c34118e0a93078364e86541377cb2d10a87cbb38" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.040131 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.072684 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.093039 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:50 crc kubenswrapper[4885]: E0308 19:54:50.093784 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-metadata" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.093808 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-metadata" Mar 08 19:54:50 crc kubenswrapper[4885]: E0308 19:54:50.093845 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-log" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.093855 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-log" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.094264 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-log" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.094300 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-metadata" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.095737 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.099723 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.099765 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.107711 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.278809 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.278882 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.279028 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws8s4\" (UniqueName: \"kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.279065 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.279184 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.381393 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws8s4\" (UniqueName: \"kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.381454 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.381575 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.381755 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.381799 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.382798 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.388521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.402138 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc 
kubenswrapper[4885]: I0308 19:54:50.405004 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.418128 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws8s4\" (UniqueName: \"kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.717276 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:51 crc kubenswrapper[4885]: I0308 19:54:51.007239 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:51 crc kubenswrapper[4885]: I0308 19:54:51.014540 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bbdf164-51e7-4faf-986b-fba5044fad2b","Type":"ContainerStarted","Data":"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6"} Mar 08 19:54:51 crc kubenswrapper[4885]: I0308 19:54:51.014719 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:51 crc kubenswrapper[4885]: W0308 19:54:51.016124 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda54375d5_43ad_493f_87b8_f10b9d6f68f9.slice/crio-72d850d89769888e1770839347c05fb2c096d5b48c6987176e4df5105daaa18d WatchSource:0}: Error finding container 72d850d89769888e1770839347c05fb2c096d5b48c6987176e4df5105daaa18d: Status 404 returned error can't find the container with id 
72d850d89769888e1770839347c05fb2c096d5b48c6987176e4df5105daaa18d Mar 08 19:54:51 crc kubenswrapper[4885]: I0308 19:54:51.046184 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.04616695 podStartE2EDuration="3.04616695s" podCreationTimestamp="2026-03-08 19:54:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:51.033025107 +0000 UTC m=+1392.429079130" watchObservedRunningTime="2026-03-08 19:54:51.04616695 +0000 UTC m=+1392.442220973" Mar 08 19:54:51 crc kubenswrapper[4885]: I0308 19:54:51.389345 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" path="/var/lib/kubelet/pods/7db4ab0b-c8b4-4809-8ff5-31f5175de78d/volumes" Mar 08 19:54:51 crc kubenswrapper[4885]: E0308 19:54:51.943994 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:54:51 crc kubenswrapper[4885]: E0308 19:54:51.946243 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:54:51 crc kubenswrapper[4885]: E0308 19:54:51.947992 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:54:51 crc kubenswrapper[4885]: E0308 19:54:51.948076 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="699e9586-4fcb-4a93-b479-44d269162645" containerName="nova-scheduler-scheduler" Mar 08 19:54:52 crc kubenswrapper[4885]: I0308 19:54:52.029209 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerStarted","Data":"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552"} Mar 08 19:54:52 crc kubenswrapper[4885]: I0308 19:54:52.029288 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerStarted","Data":"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898"} Mar 08 19:54:52 crc kubenswrapper[4885]: I0308 19:54:52.029310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerStarted","Data":"72d850d89769888e1770839347c05fb2c096d5b48c6987176e4df5105daaa18d"} Mar 08 19:54:52 crc kubenswrapper[4885]: I0308 19:54:52.054307 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.05428963 podStartE2EDuration="2.05428963s" podCreationTimestamp="2026-03-08 19:54:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:52.051826593 +0000 UTC m=+1393.447880616" watchObservedRunningTime="2026-03-08 19:54:52.05428963 +0000 UTC m=+1393.450343653" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.559577 4885 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.664030 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle\") pod \"699e9586-4fcb-4a93-b479-44d269162645\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.664253 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvzmh\" (UniqueName: \"kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh\") pod \"699e9586-4fcb-4a93-b479-44d269162645\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.664360 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data\") pod \"699e9586-4fcb-4a93-b479-44d269162645\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.672284 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh" (OuterVolumeSpecName: "kube-api-access-zvzmh") pod "699e9586-4fcb-4a93-b479-44d269162645" (UID: "699e9586-4fcb-4a93-b479-44d269162645"). InnerVolumeSpecName "kube-api-access-zvzmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.699646 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data" (OuterVolumeSpecName: "config-data") pod "699e9586-4fcb-4a93-b479-44d269162645" (UID: "699e9586-4fcb-4a93-b479-44d269162645"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.711626 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "699e9586-4fcb-4a93-b479-44d269162645" (UID: "699e9586-4fcb-4a93-b479-44d269162645"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.766097 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvzmh\" (UniqueName: \"kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.766129 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.766140 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.891287 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.901229 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.053073 4885 generic.go:334] "Generic (PLEG): container finished" podID="5f00be59-96a4-4f2f-b319-6435aa008932" containerID="52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d" exitCode=0 Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.053167 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerDied","Data":"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d"} Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.053176 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.054098 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerDied","Data":"44a4da7dfb75724690fd74611aac2ae815bfdeff07d9af1d06d6b4bf6e5274e1"} Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.054132 4885 scope.go:117] "RemoveContainer" containerID="52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.056348 4885 generic.go:334] "Generic (PLEG): container finished" podID="699e9586-4fcb-4a93-b479-44d269162645" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" exitCode=0 Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.056383 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"699e9586-4fcb-4a93-b479-44d269162645","Type":"ContainerDied","Data":"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922"} Mar 08 19:54:54 crc 
kubenswrapper[4885]: I0308 19:54:54.056408 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"699e9586-4fcb-4a93-b479-44d269162645","Type":"ContainerDied","Data":"28d9f2d30308ef7519a870cfda7992bd0f04e5904fdb1fac5d4c1b4a105e7de5"} Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.056435 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.073986 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data\") pod \"5f00be59-96a4-4f2f-b319-6435aa008932\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.074139 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle\") pod \"5f00be59-96a4-4f2f-b319-6435aa008932\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.074184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9qpj\" (UniqueName: \"kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj\") pod \"5f00be59-96a4-4f2f-b319-6435aa008932\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.075186 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs\") pod \"5f00be59-96a4-4f2f-b319-6435aa008932\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.075569 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs" (OuterVolumeSpecName: "logs") pod "5f00be59-96a4-4f2f-b319-6435aa008932" (UID: "5f00be59-96a4-4f2f-b319-6435aa008932"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.076671 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.076847 4885 scope.go:117] "RemoveContainer" containerID="1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.084620 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj" (OuterVolumeSpecName: "kube-api-access-z9qpj") pod "5f00be59-96a4-4f2f-b319-6435aa008932" (UID: "5f00be59-96a4-4f2f-b319-6435aa008932"). InnerVolumeSpecName "kube-api-access-z9qpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.095286 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.107834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f00be59-96a4-4f2f-b319-6435aa008932" (UID: "5f00be59-96a4-4f2f-b319-6435aa008932"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.115442 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.120482 4885 scope.go:117] "RemoveContainer" containerID="52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d" Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.121032 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d\": container with ID starting with 52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d not found: ID does not exist" containerID="52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.121077 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d"} err="failed to get container status \"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d\": rpc error: code = NotFound desc = could not find container \"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d\": container with ID starting with 52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d not found: ID does not exist" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.121109 4885 scope.go:117] "RemoveContainer" containerID="1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff" Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.121358 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff\": container with ID starting with 1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff not found: ID does not 
exist" containerID="1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.121381 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff"} err="failed to get container status \"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff\": rpc error: code = NotFound desc = could not find container \"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff\": container with ID starting with 1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff not found: ID does not exist" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.121407 4885 scope.go:117] "RemoveContainer" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.123818 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.124383 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-log" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124402 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-log" Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.124460 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699e9586-4fcb-4a93-b479-44d269162645" containerName="nova-scheduler-scheduler" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124470 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="699e9586-4fcb-4a93-b479-44d269162645" containerName="nova-scheduler-scheduler" Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.124487 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" 
containerName="nova-api-api" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124496 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-api" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124723 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="699e9586-4fcb-4a93-b479-44d269162645" containerName="nova-scheduler-scheduler" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124739 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-api" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124758 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-log" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.125598 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.127719 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.133407 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.152538 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data" (OuterVolumeSpecName: "config-data") pod "5f00be59-96a4-4f2f-b319-6435aa008932" (UID: "5f00be59-96a4-4f2f-b319-6435aa008932"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.178197 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.178229 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9qpj\" (UniqueName: \"kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.178259 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.226554 4885 scope.go:117] "RemoveContainer" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.227791 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922\": container with ID starting with eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922 not found: ID does not exist" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.227831 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922"} err="failed to get container status \"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922\": rpc error: code = NotFound desc = could not find container \"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922\": container with ID 
starting with eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922 not found: ID does not exist" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.280122 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lzkz\" (UniqueName: \"kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.280310 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.280504 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.381678 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lzkz\" (UniqueName: \"kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.381814 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " 
pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.382613 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.383784 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.385891 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.386240 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.397363 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.408308 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lzkz\" (UniqueName: \"kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.428526 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.429897 4885 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.434586 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.446832 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.523611 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.586115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.586408 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8p8t\" (UniqueName: \"kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.586504 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.586552 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs\") pod \"nova-api-0\" (UID: 
\"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.687607 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8p8t\" (UniqueName: \"kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.687858 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.687885 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.687947 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.688435 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.693196 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.696208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.704314 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8p8t\" (UniqueName: \"kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.812576 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.981892 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: W0308 19:54:54.984310 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0768dc6_7cf7_4bd7_a1de_6a68f604de14.slice/crio-0ed53f5b10de0d8d9229c24ba782e6c571620fd87d68a50336540413e9a25108 WatchSource:0}: Error finding container 0ed53f5b10de0d8d9229c24ba782e6c571620fd87d68a50336540413e9a25108: Status 404 returned error can't find the container with id 0ed53f5b10de0d8d9229c24ba782e6c571620fd87d68a50336540413e9a25108 Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.072076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"d0768dc6-7cf7-4bd7-a1de-6a68f604de14","Type":"ContainerStarted","Data":"0ed53f5b10de0d8d9229c24ba782e6c571620fd87d68a50336540413e9a25108"} Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.075312 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:55 crc kubenswrapper[4885]: W0308 19:54:55.087593 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod273a661b_19fb_47d2_b1d6_05ddf548f212.slice/crio-8cdb2767b2eabddf4d25eac6833f5ae8cc579f3e077a9d8508fa5553f3b7e1b3 WatchSource:0}: Error finding container 8cdb2767b2eabddf4d25eac6833f5ae8cc579f3e077a9d8508fa5553f3b7e1b3: Status 404 returned error can't find the container with id 8cdb2767b2eabddf4d25eac6833f5ae8cc579f3e077a9d8508fa5553f3b7e1b3 Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.381181 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" path="/var/lib/kubelet/pods/5f00be59-96a4-4f2f-b319-6435aa008932/volumes" Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.382179 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699e9586-4fcb-4a93-b479-44d269162645" path="/var/lib/kubelet/pods/699e9586-4fcb-4a93-b479-44d269162645/volumes" Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.717778 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.718075 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.020735 4885 scope.go:117] "RemoveContainer" containerID="027375be475663b68fa34275cf933a5f73118e3902051a04110bd2c7ec89a43e" Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.085735 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"d0768dc6-7cf7-4bd7-a1de-6a68f604de14","Type":"ContainerStarted","Data":"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35"} Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.088766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerStarted","Data":"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7"} Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.088808 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerStarted","Data":"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582"} Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.088825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerStarted","Data":"8cdb2767b2eabddf4d25eac6833f5ae8cc579f3e077a9d8508fa5553f3b7e1b3"} Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.119704 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.11967893 podStartE2EDuration="2.11967893s" podCreationTimestamp="2026-03-08 19:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:56.108203363 +0000 UTC m=+1397.504257396" watchObservedRunningTime="2026-03-08 19:54:56.11967893 +0000 UTC m=+1397.515732993" Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.139947 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.139929253 podStartE2EDuration="2.139929253s" podCreationTimestamp="2026-03-08 19:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-08 19:54:56.129327859 +0000 UTC m=+1397.525381882" watchObservedRunningTime="2026-03-08 19:54:56.139929253 +0000 UTC m=+1397.535983276" Mar 08 19:54:57 crc kubenswrapper[4885]: I0308 19:54:57.462394 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:57 crc kubenswrapper[4885]: I0308 19:54:57.462877 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="50f2f07f-efc4-4778-944c-d4819f0b0e30" containerName="kube-state-metrics" containerID="cri-o://fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce" gracePeriod=30 Mar 08 19:54:57 crc kubenswrapper[4885]: I0308 19:54:57.944240 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.046804 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxdn7\" (UniqueName: \"kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7\") pod \"50f2f07f-efc4-4778-944c-d4819f0b0e30\" (UID: \"50f2f07f-efc4-4778-944c-d4819f0b0e30\") " Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.059116 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7" (OuterVolumeSpecName: "kube-api-access-cxdn7") pod "50f2f07f-efc4-4778-944c-d4819f0b0e30" (UID: "50f2f07f-efc4-4778-944c-d4819f0b0e30"). InnerVolumeSpecName "kube-api-access-cxdn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.109052 4885 generic.go:334] "Generic (PLEG): container finished" podID="50f2f07f-efc4-4778-944c-d4819f0b0e30" containerID="fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce" exitCode=2 Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.109116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"50f2f07f-efc4-4778-944c-d4819f0b0e30","Type":"ContainerDied","Data":"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce"} Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.109153 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"50f2f07f-efc4-4778-944c-d4819f0b0e30","Type":"ContainerDied","Data":"3c7c7d0cb67bdd716e27a921f55c82cceb775a12654fa8ab0fa3866727e30e29"} Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.109180 4885 scope.go:117] "RemoveContainer" containerID="fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.109341 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.131575 4885 scope.go:117] "RemoveContainer" containerID="fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce" Mar 08 19:54:58 crc kubenswrapper[4885]: E0308 19:54:58.132052 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce\": container with ID starting with fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce not found: ID does not exist" containerID="fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.132088 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce"} err="failed to get container status \"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce\": rpc error: code = NotFound desc = could not find container \"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce\": container with ID starting with fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce not found: ID does not exist" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.152037 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxdn7\" (UniqueName: \"kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.161025 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.180646 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.193392 4885 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:58 crc kubenswrapper[4885]: E0308 19:54:58.193779 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f2f07f-efc4-4778-944c-d4819f0b0e30" containerName="kube-state-metrics" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.193797 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f2f07f-efc4-4778-944c-d4819f0b0e30" containerName="kube-state-metrics" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.194009 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f2f07f-efc4-4778-944c-d4819f0b0e30" containerName="kube-state-metrics" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.194706 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.196702 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.197051 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.205334 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.355385 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqx6r\" (UniqueName: \"kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.355711 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.355731 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.355890 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.457597 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.457693 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.457783 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.457952 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqx6r\" (UniqueName: \"kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.464729 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.464830 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.467845 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.479399 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqx6r\" (UniqueName: 
\"kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.522141 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.032360 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.135770 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"63c3ea8e-9683-45b9-805b-d1049840b0da","Type":"ContainerStarted","Data":"c766b62f5cceadf0886905919f92953b9185d1630ad51bf23b383898036558fd"} Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.185953 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.186276 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-central-agent" containerID="cri-o://608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa" gracePeriod=30 Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.186435 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="proxy-httpd" containerID="cri-o://955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853" gracePeriod=30 Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.186511 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="sg-core" 
containerID="cri-o://9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1" gracePeriod=30 Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.186571 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-notification-agent" containerID="cri-o://640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055" gracePeriod=30 Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.332560 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.378839 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f2f07f-efc4-4778-944c-d4819f0b0e30" path="/var/lib/kubelet/pods/50f2f07f-efc4-4778-944c-d4819f0b0e30/volumes" Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.528049 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.147713 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"63c3ea8e-9683-45b9-805b-d1049840b0da","Type":"ContainerStarted","Data":"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b"} Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.148114 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.151896 4885 generic.go:334] "Generic (PLEG): container finished" podID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerID="955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853" exitCode=0 Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.151979 4885 generic.go:334] "Generic (PLEG): container finished" podID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" 
containerID="9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1" exitCode=2 Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.151993 4885 generic.go:334] "Generic (PLEG): container finished" podID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerID="608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa" exitCode=0 Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.152012 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerDied","Data":"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853"} Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.152086 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerDied","Data":"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1"} Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.152105 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerDied","Data":"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa"} Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.173492 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.792448626 podStartE2EDuration="2.173461071s" podCreationTimestamp="2026-03-08 19:54:58 +0000 UTC" firstStartedPulling="2026-03-08 19:54:59.034330412 +0000 UTC m=+1400.430384435" lastFinishedPulling="2026-03-08 19:54:59.415342847 +0000 UTC m=+1400.811396880" observedRunningTime="2026-03-08 19:55:00.166242018 +0000 UTC m=+1401.562296061" watchObservedRunningTime="2026-03-08 19:55:00.173461071 +0000 UTC m=+1401.569515104" Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.717951 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.718355 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.729411 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.738812 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.738951 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.834393 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.834587 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.834658 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.834730 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.835135 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.835472 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.835580 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2s99\" (UniqueName: \"kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.835655 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: 
\"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.836322 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.836678 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.843092 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts" (OuterVolumeSpecName: "scripts") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.844902 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99" (OuterVolumeSpecName: "kube-api-access-l2s99") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "kube-api-access-l2s99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.863541 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.938344 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.938368 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.938382 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2s99\" (UniqueName: \"kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.938392 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.944364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.005147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data" (OuterVolumeSpecName: "config-data") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.040264 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.040300 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.170487 4885 generic.go:334] "Generic (PLEG): container finished" podID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerID="640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055" exitCode=0 Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.170537 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerDied","Data":"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055"} Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.170567 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerDied","Data":"031cf273d2ce6f416df1db118df4e04e2598121142c98562d1d0691ec1ae6950"} Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.170585 4885 scope.go:117] "RemoveContainer" containerID="955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.170742 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.207681 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.211316 4885 scope.go:117] "RemoveContainer" containerID="9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.228073 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.243417 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.243878 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-notification-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.243984 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-notification-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.244006 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="sg-core" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244014 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="sg-core" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.244067 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="proxy-httpd" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244077 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="proxy-httpd" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.244091 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-central-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244099 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-central-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244330 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-notification-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244359 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-central-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244369 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="sg-core" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244388 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="proxy-httpd" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.247072 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.249571 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.249737 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.250107 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.252219 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.267507 4885 scope.go:117] "RemoveContainer" containerID="640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.303498 4885 scope.go:117] "RemoveContainer" containerID="608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.326114 4885 scope.go:117] "RemoveContainer" containerID="955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.327249 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853\": container with ID starting with 955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853 not found: ID does not exist" containerID="955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.327296 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853"} err="failed to get container status 
\"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853\": rpc error: code = NotFound desc = could not find container \"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853\": container with ID starting with 955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853 not found: ID does not exist" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.327329 4885 scope.go:117] "RemoveContainer" containerID="9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.328386 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1\": container with ID starting with 9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1 not found: ID does not exist" containerID="9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.328483 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1"} err="failed to get container status \"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1\": rpc error: code = NotFound desc = could not find container \"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1\": container with ID starting with 9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1 not found: ID does not exist" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.328503 4885 scope.go:117] "RemoveContainer" containerID="640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.329808 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055\": container with ID starting with 640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055 not found: ID does not exist" containerID="640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.329862 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055"} err="failed to get container status \"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055\": rpc error: code = NotFound desc = could not find container \"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055\": container with ID starting with 640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055 not found: ID does not exist" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.329893 4885 scope.go:117] "RemoveContainer" containerID="608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.330572 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa\": container with ID starting with 608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa not found: ID does not exist" containerID="608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.330609 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa"} err="failed to get container status \"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa\": rpc error: code = NotFound desc = could not find container \"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa\": container with ID 
starting with 608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa not found: ID does not exist" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345139 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345235 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345295 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpp4k\" (UniqueName: \"kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345311 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345346 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc 
kubenswrapper[4885]: I0308 19:55:02.345367 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345380 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345402 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.446685 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447063 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447138 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpp4k\" (UniqueName: 
\"kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447162 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447213 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447251 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447312 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc 
kubenswrapper[4885]: I0308 19:55:02.447884 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.450078 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.452682 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.454459 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.454797 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.456733 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.463201 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.467519 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpp4k\" (UniqueName: \"kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.567018 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:03 crc kubenswrapper[4885]: I0308 19:55:03.386409 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" path="/var/lib/kubelet/pods/d0b0d3fd-e485-4924-8a1b-6214b7840e52/volumes" Mar 08 19:55:03 crc kubenswrapper[4885]: I0308 19:55:03.624567 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:04 crc kubenswrapper[4885]: I0308 19:55:04.193597 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerStarted","Data":"3d7681c02db53e0bc7cd05053b145730f9d4263a90b7db83dc6a7abc4e56e119"} Mar 08 19:55:04 crc kubenswrapper[4885]: I0308 19:55:04.524164 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 19:55:04 crc kubenswrapper[4885]: I0308 19:55:04.572299 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 19:55:04 
crc kubenswrapper[4885]: I0308 19:55:04.814016 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:55:04 crc kubenswrapper[4885]: I0308 19:55:04.814089 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:55:05 crc kubenswrapper[4885]: I0308 19:55:05.211325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerStarted","Data":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} Mar 08 19:55:05 crc kubenswrapper[4885]: I0308 19:55:05.211391 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerStarted","Data":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} Mar 08 19:55:05 crc kubenswrapper[4885]: I0308 19:55:05.259231 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 19:55:05 crc kubenswrapper[4885]: I0308 19:55:05.895082 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:05 crc kubenswrapper[4885]: I0308 19:55:05.895130 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:06 crc kubenswrapper[4885]: I0308 19:55:06.219846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerStarted","Data":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} Mar 08 19:55:08 crc kubenswrapper[4885]: I0308 19:55:08.244884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerStarted","Data":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} Mar 08 19:55:08 crc kubenswrapper[4885]: I0308 19:55:08.247111 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 19:55:08 crc kubenswrapper[4885]: I0308 19:55:08.291031 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.331236207 podStartE2EDuration="6.291002149s" podCreationTimestamp="2026-03-08 19:55:02 +0000 UTC" firstStartedPulling="2026-03-08 19:55:03.63909636 +0000 UTC m=+1405.035150413" lastFinishedPulling="2026-03-08 19:55:07.598862302 +0000 UTC m=+1408.994916355" observedRunningTime="2026-03-08 19:55:08.279764088 +0000 UTC m=+1409.675818151" watchObservedRunningTime="2026-03-08 19:55:08.291002149 +0000 UTC m=+1409.687056212" Mar 08 19:55:08 crc kubenswrapper[4885]: I0308 19:55:08.539522 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 08 19:55:10 crc kubenswrapper[4885]: I0308 19:55:10.723000 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 19:55:10 crc kubenswrapper[4885]: I0308 19:55:10.727854 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 19:55:10 crc kubenswrapper[4885]: I0308 19:55:10.732106 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 19:55:11 crc kubenswrapper[4885]: I0308 19:55:11.291867 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.290125 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.330463 4885 generic.go:334] "Generic (PLEG): container finished" podID="515ba29a-53ae-41dc-a444-9ffe060dc61f" containerID="4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e" exitCode=137 Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.330522 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"515ba29a-53ae-41dc-a444-9ffe060dc61f","Type":"ContainerDied","Data":"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e"} Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.330586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"515ba29a-53ae-41dc-a444-9ffe060dc61f","Type":"ContainerDied","Data":"93b0b9013aeb93d202bb43a6cf260d8de9e26ddf4b972302a13aaf4bbc7119c7"} Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.330609 4885 scope.go:117] "RemoveContainer" containerID="4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.330538 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.374306 4885 scope.go:117] "RemoveContainer" containerID="4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e" Mar 08 19:55:13 crc kubenswrapper[4885]: E0308 19:55:13.375120 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e\": container with ID starting with 4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e not found: ID does not exist" containerID="4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.375173 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e"} err="failed to get container status \"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e\": rpc error: code = NotFound desc = could not find container \"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e\": container with ID starting with 4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e not found: ID does not exist" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.387735 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle\") pod \"515ba29a-53ae-41dc-a444-9ffe060dc61f\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.387889 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcznq\" (UniqueName: \"kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq\") pod \"515ba29a-53ae-41dc-a444-9ffe060dc61f\" (UID: 
\"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.387944 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data\") pod \"515ba29a-53ae-41dc-a444-9ffe060dc61f\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.402002 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq" (OuterVolumeSpecName: "kube-api-access-zcznq") pod "515ba29a-53ae-41dc-a444-9ffe060dc61f" (UID: "515ba29a-53ae-41dc-a444-9ffe060dc61f"). InnerVolumeSpecName "kube-api-access-zcznq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.416657 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data" (OuterVolumeSpecName: "config-data") pod "515ba29a-53ae-41dc-a444-9ffe060dc61f" (UID: "515ba29a-53ae-41dc-a444-9ffe060dc61f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.425058 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "515ba29a-53ae-41dc-a444-9ffe060dc61f" (UID: "515ba29a-53ae-41dc-a444-9ffe060dc61f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.490150 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.490201 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcznq\" (UniqueName: \"kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.490222 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.677673 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.690528 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.706219 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:55:13 crc kubenswrapper[4885]: E0308 19:55:13.706674 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515ba29a-53ae-41dc-a444-9ffe060dc61f" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.706695 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="515ba29a-53ae-41dc-a444-9ffe060dc61f" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.706941 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="515ba29a-53ae-41dc-a444-9ffe060dc61f" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 
19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.707843 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.711510 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.711880 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.712115 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.740539 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.798641 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.798864 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxpr8\" (UniqueName: \"kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.799212 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.799475 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.799550 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.900944 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.900994 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.901081 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.901128 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxpr8\" (UniqueName: \"kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.901169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.905857 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.907281 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.907723 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.912392 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.919431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxpr8\" (UniqueName: \"kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.028947 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.613961 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:55:14 crc kubenswrapper[4885]: W0308 19:55:14.622630 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90fb4d53_4722_4f72_9f1a_99ee2b637f6e.slice/crio-36ec2d571999d46c471303241aed86c9b5d0e06a8eac136c91582cb662c8d63a WatchSource:0}: Error finding container 36ec2d571999d46c471303241aed86c9b5d0e06a8eac136c91582cb662c8d63a: Status 404 returned error can't find the container with id 36ec2d571999d46c471303241aed86c9b5d0e06a8eac136c91582cb662c8d63a Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.817659 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.818497 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.820495 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.822828 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.358045 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"90fb4d53-4722-4f72-9f1a-99ee2b637f6e","Type":"ContainerStarted","Data":"32768cc82a61a862691a2497facf7d05127e1de2145b441cde8a059f45152b82"} Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.358082 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"90fb4d53-4722-4f72-9f1a-99ee2b637f6e","Type":"ContainerStarted","Data":"36ec2d571999d46c471303241aed86c9b5d0e06a8eac136c91582cb662c8d63a"} Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.358350 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.361405 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.381826 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515ba29a-53ae-41dc-a444-9ffe060dc61f" path="/var/lib/kubelet/pods/515ba29a-53ae-41dc-a444-9ffe060dc61f/volumes" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.397720 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.397705003 podStartE2EDuration="2.397705003s" podCreationTimestamp="2026-03-08 19:55:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:15.388504656 
+0000 UTC m=+1416.784558689" watchObservedRunningTime="2026-03-08 19:55:15.397705003 +0000 UTC m=+1416.793759026" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.548805 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"] Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.550949 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.619771 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"] Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648550 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648620 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648678 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648709 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648757 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j727f\" (UniqueName: \"kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648775 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750210 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j727f\" (UniqueName: \"kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750268 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750351 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750406 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750482 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750537 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.751944 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.752049 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.751981 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.751991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.751950 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.769873 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j727f\" (UniqueName: \"kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.882888 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:16 crc kubenswrapper[4885]: I0308 19:55:16.380968 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"] Mar 08 19:55:16 crc kubenswrapper[4885]: W0308 19:55:16.384950 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8185583f_0ca5_46b1_a1ed_77c35b13a07b.slice/crio-faaf9045f9d157b601bfa8ca719045997cf05f3ca07f75878e38d95227719aa4 WatchSource:0}: Error finding container faaf9045f9d157b601bfa8ca719045997cf05f3ca07f75878e38d95227719aa4: Status 404 returned error can't find the container with id faaf9045f9d157b601bfa8ca719045997cf05f3ca07f75878e38d95227719aa4 Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.268718 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.269290 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-central-agent" containerID="cri-o://4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" gracePeriod=30 Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.269400 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-notification-agent" containerID="cri-o://5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" gracePeriod=30 Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.269396 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="sg-core" containerID="cri-o://d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" gracePeriod=30 Mar 08 19:55:17 crc 
kubenswrapper[4885]: I0308 19:55:17.269556 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="proxy-httpd" containerID="cri-o://f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" gracePeriod=30 Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.280932 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.203:3000/\": read tcp 10.217.0.2:41260->10.217.0.203:3000: read: connection reset by peer" Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.375992 4885 generic.go:334] "Generic (PLEG): container finished" podID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerID="4a9e799e765066afa260a1cecf9172d38f7e49cda7c9f4bc8c9ce49bcef121a4" exitCode=0 Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.390206 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" event={"ID":"8185583f-0ca5-46b1-a1ed-77c35b13a07b","Type":"ContainerDied","Data":"4a9e799e765066afa260a1cecf9172d38f7e49cda7c9f4bc8c9ce49bcef121a4"} Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.390273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" event={"ID":"8185583f-0ca5-46b1-a1ed-77c35b13a07b","Type":"ContainerStarted","Data":"faaf9045f9d157b601bfa8ca719045997cf05f3ca07f75878e38d95227719aa4"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.383903 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.393528 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.400492 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" event={"ID":"8185583f-0ca5-46b1-a1ed-77c35b13a07b","Type":"ContainerStarted","Data":"8f4e92cbc965cfb3ab7ad14008b7ceea724e345981ffb88a02993256da9e6dcb"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.401648 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404276 4885 generic.go:334] "Generic (PLEG): container finished" podID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" exitCode=0 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404310 4885 generic.go:334] "Generic (PLEG): container finished" podID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" exitCode=2 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404322 4885 generic.go:334] "Generic (PLEG): container finished" podID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" exitCode=0 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404332 4885 generic.go:334] "Generic (PLEG): container finished" podID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" exitCode=0 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404510 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-log" containerID="cri-o://7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582" gracePeriod=30 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404813 4885 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405405 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerDied","Data":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405443 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerDied","Data":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405481 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerDied","Data":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerDied","Data":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerDied","Data":"3d7681c02db53e0bc7cd05053b145730f9d4263a90b7db83dc6a7abc4e56e119"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405528 4885 scope.go:117] "RemoveContainer" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405721 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-api" 
containerID="cri-o://a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7" gracePeriod=30 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.451332 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" podStartSLOduration=3.451316817 podStartE2EDuration="3.451316817s" podCreationTimestamp="2026-03-08 19:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:18.446802096 +0000 UTC m=+1419.842856129" watchObservedRunningTime="2026-03-08 19:55:18.451316817 +0000 UTC m=+1419.847370840" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.457785 4885 scope.go:117] "RemoveContainer" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.496428 4885 scope.go:117] "RemoveContainer" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.514956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515036 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml\") pod 
\"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515171 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515192 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515235 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpp4k\" (UniqueName: \"kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515302 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515394 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.519117 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.519782 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.527021 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts" (OuterVolumeSpecName: "scripts") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.528032 4885 scope.go:117] "RemoveContainer" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.537071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k" (OuterVolumeSpecName: "kube-api-access-jpp4k") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "kube-api-access-jpp4k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.550062 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.570264 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.589317 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620024 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpp4k\" (UniqueName: \"kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620269 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620279 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620289 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620297 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620305 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620313 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.636489 4885 
scope.go:117] "RemoveContainer" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.638260 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": container with ID starting with f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e not found: ID does not exist" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.638301 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} err="failed to get container status \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": rpc error: code = NotFound desc = could not find container \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": container with ID starting with f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.638325 4885 scope.go:117] "RemoveContainer" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.639377 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": container with ID starting with d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf not found: ID does not exist" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.639411 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} err="failed to get container status \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": rpc error: code = NotFound desc = could not find container \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": container with ID starting with d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.639427 4885 scope.go:117] "RemoveContainer" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.639835 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": container with ID starting with 5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293 not found: ID does not exist" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.639859 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} err="failed to get container status \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": rpc error: code = NotFound desc = could not find container \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": container with ID starting with 5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.639873 4885 scope.go:117] "RemoveContainer" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.640143 4885 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": container with ID starting with 4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9 not found: ID does not exist" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.640187 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} err="failed to get container status \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": rpc error: code = NotFound desc = could not find container \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": container with ID starting with 4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.640215 4885 scope.go:117] "RemoveContainer" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.640490 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} err="failed to get container status \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": rpc error: code = NotFound desc = could not find container \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": container with ID starting with f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.640514 4885 scope.go:117] "RemoveContainer" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.640949 4885 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} err="failed to get container status \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": rpc error: code = NotFound desc = could not find container \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": container with ID starting with d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.640968 4885 scope.go:117] "RemoveContainer" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641163 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} err="failed to get container status \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": rpc error: code = NotFound desc = could not find container \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": container with ID starting with 5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641180 4885 scope.go:117] "RemoveContainer" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641356 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} err="failed to get container status \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": rpc error: code = NotFound desc = could not find container \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": container with ID starting with 
4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641373 4885 scope.go:117] "RemoveContainer" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641550 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} err="failed to get container status \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": rpc error: code = NotFound desc = could not find container \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": container with ID starting with f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641568 4885 scope.go:117] "RemoveContainer" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641768 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} err="failed to get container status \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": rpc error: code = NotFound desc = could not find container \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": container with ID starting with d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641787 4885 scope.go:117] "RemoveContainer" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642007 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} err="failed to get container status \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": rpc error: code = NotFound desc = could not find container \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": container with ID starting with 5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642023 4885 scope.go:117] "RemoveContainer" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642232 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} err="failed to get container status \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": rpc error: code = NotFound desc = could not find container \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": container with ID starting with 4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642250 4885 scope.go:117] "RemoveContainer" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642419 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} err="failed to get container status \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": rpc error: code = NotFound desc = could not find container \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": container with ID starting with f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e not found: ID does not 
exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642435 4885 scope.go:117] "RemoveContainer" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642649 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} err="failed to get container status \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": rpc error: code = NotFound desc = could not find container \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": container with ID starting with d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642666 4885 scope.go:117] "RemoveContainer" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642826 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} err="failed to get container status \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": rpc error: code = NotFound desc = could not find container \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": container with ID starting with 5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642842 4885 scope.go:117] "RemoveContainer" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.643142 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} err="failed to get container status 
\"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": rpc error: code = NotFound desc = could not find container \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": container with ID starting with 4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.653062 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data" (OuterVolumeSpecName: "config-data") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.721633 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.743722 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.751792 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.760509 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.760898 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="sg-core" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.760937 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="sg-core" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.760957 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-notification-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.760965 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-notification-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.760982 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-central-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.760988 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-central-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.760999 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="proxy-httpd" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.761004 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="proxy-httpd" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.761160 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-notification-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.761168 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="sg-core" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.761181 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="proxy-httpd" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.761191 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-central-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.763248 4885 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.768144 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.768319 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.769023 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.776978 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823299 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823351 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823368 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823414 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823579 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hqdd\" (UniqueName: \"kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823714 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823859 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823902 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.925734 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.926385 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.926823 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.926896 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.927014 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.927141 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqdd\" (UniqueName: \"kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: 
I0308 19:55:18.927222 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.927305 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.927845 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.926307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.933001 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.934143 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.938655 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.939890 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.940095 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.944545 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hqdd\" (UniqueName: \"kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.029259 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.081615 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.377086 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" path="/var/lib/kubelet/pods/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6/volumes" Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.565703 4885 generic.go:334] "Generic (PLEG): container finished" podID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerID="7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582" exitCode=143 Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.565811 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerDied","Data":"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582"} Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.624517 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:20 crc kubenswrapper[4885]: I0308 19:55:20.576364 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerStarted","Data":"0b1f8ae1cf897e68887902e9c908567ae03c21713be181a14ae5dfb412f11087"} Mar 08 19:55:20 crc kubenswrapper[4885]: I0308 19:55:20.576973 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerStarted","Data":"cc7fd9651b0daebb9ca4a98f8eaeb673920ad2e9d2e71ad709ac41621e099e5b"} Mar 08 19:55:21 crc kubenswrapper[4885]: I0308 19:55:21.014557 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:21 crc kubenswrapper[4885]: I0308 19:55:21.594556 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerStarted","Data":"28cf832d6e33ae08ba17636d041e0bb1069736d882a89f4775ec27440fe30839"} Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.368086 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.518066 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data\") pod \"273a661b-19fb-47d2-b1d6-05ddf548f212\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.518154 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle\") pod \"273a661b-19fb-47d2-b1d6-05ddf548f212\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.518232 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8p8t\" (UniqueName: \"kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t\") pod \"273a661b-19fb-47d2-b1d6-05ddf548f212\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.518344 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs\") pod \"273a661b-19fb-47d2-b1d6-05ddf548f212\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.523263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs" (OuterVolumeSpecName: "logs") pod 
"273a661b-19fb-47d2-b1d6-05ddf548f212" (UID: "273a661b-19fb-47d2-b1d6-05ddf548f212"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.526910 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t" (OuterVolumeSpecName: "kube-api-access-n8p8t") pod "273a661b-19fb-47d2-b1d6-05ddf548f212" (UID: "273a661b-19fb-47d2-b1d6-05ddf548f212"). InnerVolumeSpecName "kube-api-access-n8p8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.545814 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data" (OuterVolumeSpecName: "config-data") pod "273a661b-19fb-47d2-b1d6-05ddf548f212" (UID: "273a661b-19fb-47d2-b1d6-05ddf548f212"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.548399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "273a661b-19fb-47d2-b1d6-05ddf548f212" (UID: "273a661b-19fb-47d2-b1d6-05ddf548f212"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.610627 4885 generic.go:334] "Generic (PLEG): container finished" podID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerID="a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7" exitCode=0 Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.610706 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.610733 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerDied","Data":"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7"} Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.610761 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerDied","Data":"8cdb2767b2eabddf4d25eac6833f5ae8cc579f3e077a9d8508fa5553f3b7e1b3"} Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.610776 4885 scope.go:117] "RemoveContainer" containerID="a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.619165 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerStarted","Data":"8f7f8df970a5fdc517bdf1a2dd73795a6937f04ee672227be9fa57b2dac077d8"} Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.620629 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.620650 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.620659 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.620779 4885 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-n8p8t\" (UniqueName: \"kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.652193 4885 scope.go:117] "RemoveContainer" containerID="7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.655509 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.676721 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.692509 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:22 crc kubenswrapper[4885]: E0308 19:55:22.692880 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-log" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.692897 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-log" Mar 08 19:55:22 crc kubenswrapper[4885]: E0308 19:55:22.692931 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-api" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.692938 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-api" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.693115 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-api" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.693132 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-log" Mar 08 19:55:22 crc 
kubenswrapper[4885]: I0308 19:55:22.693996 4885 scope.go:117] "RemoveContainer" containerID="a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.694079 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: E0308 19:55:22.695502 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7\": container with ID starting with a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7 not found: ID does not exist" containerID="a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.695538 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7"} err="failed to get container status \"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7\": rpc error: code = NotFound desc = could not find container \"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7\": container with ID starting with a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7 not found: ID does not exist" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.695563 4885 scope.go:117] "RemoveContainer" containerID="7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.699197 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 08 19:55:22 crc kubenswrapper[4885]: E0308 19:55:22.699194 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582\": container 
with ID starting with 7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582 not found: ID does not exist" containerID="7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.699364 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582"} err="failed to get container status \"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582\": rpc error: code = NotFound desc = could not find container \"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582\": container with ID starting with 7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582 not found: ID does not exist" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.699426 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.699943 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.710825 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.823829 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p2mj\" (UniqueName: \"kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.824189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") 
" pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.824230 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.824264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.824391 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.824418 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.926131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.926180 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.926216 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p2mj\" (UniqueName: \"kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.926237 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.926266 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.926294 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.927571 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 
19:55:22.931449 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.938391 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.940500 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.942129 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.945313 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p2mj\" (UniqueName: \"kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:23 crc kubenswrapper[4885]: I0308 19:55:23.016171 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:23 crc kubenswrapper[4885]: I0308 19:55:23.380933 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" path="/var/lib/kubelet/pods/273a661b-19fb-47d2-b1d6-05ddf548f212/volumes" Mar 08 19:55:23 crc kubenswrapper[4885]: I0308 19:55:23.498973 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:23 crc kubenswrapper[4885]: I0308 19:55:23.639825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerStarted","Data":"30344ac6a0d4ba0d54022fa56e76ff23a3c0282a327f06f069842f7bf31a4249"} Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.029374 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.050976 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.654867 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerStarted","Data":"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359"} Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.655468 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerStarted","Data":"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8"} Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.658105 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerStarted","Data":"4339ad679bd1880ac414ad5b365774f68d747ea0a83d951d90e3f68c160e4b31"} Mar 
08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.658588 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-central-agent" containerID="cri-o://0b1f8ae1cf897e68887902e9c908567ae03c21713be181a14ae5dfb412f11087" gracePeriod=30 Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.658721 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-notification-agent" containerID="cri-o://28cf832d6e33ae08ba17636d041e0bb1069736d882a89f4775ec27440fe30839" gracePeriod=30 Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.658729 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="sg-core" containerID="cri-o://8f7f8df970a5fdc517bdf1a2dd73795a6937f04ee672227be9fa57b2dac077d8" gracePeriod=30 Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.658755 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="proxy-httpd" containerID="cri-o://4339ad679bd1880ac414ad5b365774f68d747ea0a83d951d90e3f68c160e4b31" gracePeriod=30 Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.687403 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.696942 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.696905269 podStartE2EDuration="2.696905269s" podCreationTimestamp="2026-03-08 19:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 
19:55:24.681106066 +0000 UTC m=+1426.077160149" watchObservedRunningTime="2026-03-08 19:55:24.696905269 +0000 UTC m=+1426.092959302" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.720146 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6344976879999997 podStartE2EDuration="6.720130191s" podCreationTimestamp="2026-03-08 19:55:18 +0000 UTC" firstStartedPulling="2026-03-08 19:55:19.640970199 +0000 UTC m=+1421.037024222" lastFinishedPulling="2026-03-08 19:55:23.726602692 +0000 UTC m=+1425.122656725" observedRunningTime="2026-03-08 19:55:24.71748985 +0000 UTC m=+1426.113543883" watchObservedRunningTime="2026-03-08 19:55:24.720130191 +0000 UTC m=+1426.116184224" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.910753 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-l6hg9"] Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.911995 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.913903 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.914221 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.919293 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-l6hg9"] Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.991538 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt7vh\" (UniqueName: \"kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.991631 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.991717 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.991961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.093785 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt7vh\" (UniqueName: \"kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.093833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.093854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.093998 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.100123 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.101953 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.102414 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.117480 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt7vh\" (UniqueName: \"kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.228577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.720993 4885 generic.go:334] "Generic (PLEG): container finished" podID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerID="4339ad679bd1880ac414ad5b365774f68d747ea0a83d951d90e3f68c160e4b31" exitCode=0 Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.721055 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerDied","Data":"4339ad679bd1880ac414ad5b365774f68d747ea0a83d951d90e3f68c160e4b31"} Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.722014 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerDied","Data":"8f7f8df970a5fdc517bdf1a2dd73795a6937f04ee672227be9fa57b2dac077d8"} Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.721959 4885 generic.go:334] "Generic (PLEG): container finished" podID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerID="8f7f8df970a5fdc517bdf1a2dd73795a6937f04ee672227be9fa57b2dac077d8" exitCode=2 Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.723534 4885 generic.go:334] "Generic (PLEG): container finished" podID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerID="28cf832d6e33ae08ba17636d041e0bb1069736d882a89f4775ec27440fe30839" exitCode=0 Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.723602 4885 generic.go:334] "Generic (PLEG): container finished" podID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerID="0b1f8ae1cf897e68887902e9c908567ae03c21713be181a14ae5dfb412f11087" exitCode=0 Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.723907 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerDied","Data":"28cf832d6e33ae08ba17636d041e0bb1069736d882a89f4775ec27440fe30839"} Mar 08 19:55:25 
crc kubenswrapper[4885]: I0308 19:55:25.723954 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerDied","Data":"0b1f8ae1cf897e68887902e9c908567ae03c21713be181a14ae5dfb412f11087"} Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.753933 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-l6hg9"] Mar 08 19:55:25 crc kubenswrapper[4885]: W0308 19:55:25.757712 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53478e7f_ae6d_4540_a51d_2fd03f142027.slice/crio-4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6 WatchSource:0}: Error finding container 4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6: Status 404 returned error can't find the container with id 4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6 Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.885032 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.951463 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.951693 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="dnsmasq-dns" containerID="cri-o://65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db" gracePeriod=10 Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.031208 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120208 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120277 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120327 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hqdd\" (UniqueName: \"kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120385 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120421 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120475 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120533 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120566 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120805 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.121161 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.121247 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.121266 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.125879 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd" (OuterVolumeSpecName: "kube-api-access-4hqdd") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "kube-api-access-4hqdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.133149 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts" (OuterVolumeSpecName: "scripts") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.177225 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.189427 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.206862 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.225737 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.225761 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.225771 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.225782 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts\") on 
node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.225790 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hqdd\" (UniqueName: \"kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.257005 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data" (OuterVolumeSpecName: "config-data") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.327885 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.499798 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.632955 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.633013 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.633058 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.633148 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.633191 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frh8b\" (UniqueName: \"kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.633225 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.639254 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b" (OuterVolumeSpecName: "kube-api-access-frh8b") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "kube-api-access-frh8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.686529 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.695294 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.695636 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.700722 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.710694 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config" (OuterVolumeSpecName: "config") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.736682 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.736718 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frh8b\" (UniqueName: \"kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.736732 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.736743 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 
19:55:26.736751 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.736759 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.743234 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l6hg9" event={"ID":"53478e7f-ae6d-4540-a51d-2fd03f142027","Type":"ContainerStarted","Data":"52341d43266cc07b81a420fcef2575f373be02c72919fe4d7ea1b1bbdd8174c2"} Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.743282 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l6hg9" event={"ID":"53478e7f-ae6d-4540-a51d-2fd03f142027","Type":"ContainerStarted","Data":"4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6"} Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.746988 4885 generic.go:334] "Generic (PLEG): container finished" podID="0274624f-a49d-425f-b025-753e4e174477" containerID="65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db" exitCode=0 Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.747008 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.747037 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" event={"ID":"0274624f-a49d-425f-b025-753e4e174477","Type":"ContainerDied","Data":"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db"} Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.747218 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" event={"ID":"0274624f-a49d-425f-b025-753e4e174477","Type":"ContainerDied","Data":"e0f14b4c223b387a85388f249c4bcd0b49cb89c791c76c20d6484eb655e195a3"} Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.747246 4885 scope.go:117] "RemoveContainer" containerID="65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.751465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerDied","Data":"cc7fd9651b0daebb9ca4a98f8eaeb673920ad2e9d2e71ad709ac41621e099e5b"} Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.751538 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.777889 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-l6hg9" podStartSLOduration=2.777871513 podStartE2EDuration="2.777871513s" podCreationTimestamp="2026-03-08 19:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:26.759006387 +0000 UTC m=+1428.155060430" watchObservedRunningTime="2026-03-08 19:55:26.777871513 +0000 UTC m=+1428.173925536" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.801631 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.804260 4885 scope.go:117] "RemoveContainer" containerID="bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.813188 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.834910 4885 scope.go:117] "RemoveContainer" containerID="65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db" Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.835384 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db\": container with ID starting with 65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db not found: ID does not exist" containerID="65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.835486 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db"} err="failed to get container status \"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db\": rpc error: code = NotFound desc = could not find container \"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db\": container with ID starting with 65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db not found: ID does not exist"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.835565 4885 scope.go:117] "RemoveContainer" containerID="bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7"
Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.835893 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7\": container with ID starting with bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7 not found: ID does not exist" containerID="bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.836077 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7"} err="failed to get container status \"bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7\": rpc error: code = NotFound desc = could not find container \"bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7\": container with ID starting with bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7 not found: ID does not exist"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.836143 4885 scope.go:117] "RemoveContainer" containerID="4339ad679bd1880ac414ad5b365774f68d747ea0a83d951d90e3f68c160e4b31"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.842056 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.856860 4885 scope.go:117] "RemoveContainer" containerID="8f7f8df970a5fdc517bdf1a2dd73795a6937f04ee672227be9fa57b2dac077d8"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.866376 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877349 4885 scope.go:117] "RemoveContainer" containerID="28cf832d6e33ae08ba17636d041e0bb1069736d882a89f4775ec27440fe30839"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877454 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877799 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-notification-agent"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877810 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-notification-agent"
Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877830 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="sg-core"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877836 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="sg-core"
Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877848 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-central-agent"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877854 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-central-agent"
Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877865 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="proxy-httpd"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877870 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="proxy-httpd"
Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877887 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="init"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877892 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="init"
Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877904 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="dnsmasq-dns"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877909 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="dnsmasq-dns"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.878118 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="dnsmasq-dns"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.878135 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-central-agent"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.878143 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-notification-agent"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.878153 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="proxy-httpd"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.878161 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="sg-core"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.879679 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.882428 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.882571 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.883030 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.885616 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.914203 4885 scope.go:117] "RemoveContainer" containerID="0b1f8ae1cf897e68887902e9c908567ae03c21713be181a14ae5dfb412f11087"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.044845 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.044963 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045001 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045068 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045199 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045390 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045451 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n89rb\" (UniqueName: \"kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148022 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148092 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148158 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148203 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148287 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148441 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148498 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n89rb\" (UniqueName: \"kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148560 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.149494 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.150035 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.155059 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.155564 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.156499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.158120 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.159553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.166863 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n89rb\" (UniqueName: \"kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.206297 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.395250 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0274624f-a49d-425f-b025-753e4e174477" path="/var/lib/kubelet/pods/0274624f-a49d-425f-b025-753e4e174477/volumes"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.397000 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" path="/var/lib/kubelet/pods/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e/volumes"
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.705371 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.763366 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerStarted","Data":"a1d4f26c989a88dcb6b8292fefbf9776a8ae2f04c4981fe1a6564019613a69ba"}
Mar 08 19:55:28 crc kubenswrapper[4885]: I0308 19:55:28.780047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerStarted","Data":"a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7"}
Mar 08 19:55:29 crc kubenswrapper[4885]: I0308 19:55:29.790162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerStarted","Data":"7e93f87815197e303bd6f0ad768ac092887798b010a49cf9460b37861d1fc6db"}
Mar 08 19:55:30 crc kubenswrapper[4885]: I0308 19:55:30.804572 4885 generic.go:334] "Generic (PLEG): container finished" podID="53478e7f-ae6d-4540-a51d-2fd03f142027" containerID="52341d43266cc07b81a420fcef2575f373be02c72919fe4d7ea1b1bbdd8174c2" exitCode=0
Mar 08 19:55:30 crc kubenswrapper[4885]: I0308 19:55:30.804642 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l6hg9" event={"ID":"53478e7f-ae6d-4540-a51d-2fd03f142027","Type":"ContainerDied","Data":"52341d43266cc07b81a420fcef2575f373be02c72919fe4d7ea1b1bbdd8174c2"}
Mar 08 19:55:30 crc kubenswrapper[4885]: I0308 19:55:30.809799 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerStarted","Data":"c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19"}
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.298271 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l6hg9"
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.467948 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts\") pod \"53478e7f-ae6d-4540-a51d-2fd03f142027\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") "
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.468128 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt7vh\" (UniqueName: \"kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh\") pod \"53478e7f-ae6d-4540-a51d-2fd03f142027\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") "
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.468170 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle\") pod \"53478e7f-ae6d-4540-a51d-2fd03f142027\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") "
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.468347 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data\") pod \"53478e7f-ae6d-4540-a51d-2fd03f142027\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") "
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.475877 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts" (OuterVolumeSpecName: "scripts") pod "53478e7f-ae6d-4540-a51d-2fd03f142027" (UID: "53478e7f-ae6d-4540-a51d-2fd03f142027"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.476009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh" (OuterVolumeSpecName: "kube-api-access-rt7vh") pod "53478e7f-ae6d-4540-a51d-2fd03f142027" (UID: "53478e7f-ae6d-4540-a51d-2fd03f142027"). InnerVolumeSpecName "kube-api-access-rt7vh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.503492 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data" (OuterVolumeSpecName: "config-data") pod "53478e7f-ae6d-4540-a51d-2fd03f142027" (UID: "53478e7f-ae6d-4540-a51d-2fd03f142027"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.504884 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53478e7f-ae6d-4540-a51d-2fd03f142027" (UID: "53478e7f-ae6d-4540-a51d-2fd03f142027"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.571557 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt7vh\" (UniqueName: \"kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh\") on node \"crc\" DevicePath \"\""
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.571627 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.571659 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.571686 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.851453 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerStarted","Data":"46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688"}
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.851566 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.854380 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l6hg9" event={"ID":"53478e7f-ae6d-4540-a51d-2fd03f142027","Type":"ContainerDied","Data":"4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6"}
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.854434 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6"
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.854458 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l6hg9"
Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.905869 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.843370281 podStartE2EDuration="6.905846405s" podCreationTimestamp="2026-03-08 19:55:26 +0000 UTC" firstStartedPulling="2026-03-08 19:55:27.713510281 +0000 UTC m=+1429.109564304" lastFinishedPulling="2026-03-08 19:55:31.775986385 +0000 UTC m=+1433.172040428" observedRunningTime="2026-03-08 19:55:32.893829493 +0000 UTC m=+1434.289883536" watchObservedRunningTime="2026-03-08 19:55:32.905846405 +0000 UTC m=+1434.301900448"
Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.016858 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.016913 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.024119 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.051448 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.051724 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerName="nova-scheduler-scheduler" containerID="cri-o://9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" gracePeriod=30
Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.106461 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.106715 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" containerID="cri-o://aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898" gracePeriod=30
Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.107202 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" containerID="cri-o://bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552" gracePeriod=30
Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.866248 4885 generic.go:334] "Generic (PLEG): container finished" podID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerID="aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898" exitCode=143
Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.866336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerDied","Data":"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898"}
Mar 08 19:55:34 crc kubenswrapper[4885]: I0308 19:55:34.030116 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 19:55:34 crc kubenswrapper[4885]: I0308 19:55:34.030160 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 08 19:55:34 crc kubenswrapper[4885]: E0308 19:55:34.526654 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 19:55:34 crc kubenswrapper[4885]: E0308 19:55:34.528763 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 19:55:34 crc kubenswrapper[4885]: E0308 19:55:34.530122 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 19:55:34 crc kubenswrapper[4885]: E0308 19:55:34.530195 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerName="nova-scheduler-scheduler"
Mar 08 19:55:34 crc kubenswrapper[4885]: I0308 19:55:34.879445 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-log" containerID="cri-o://442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8" gracePeriod=30
Mar 08 19:55:34 crc kubenswrapper[4885]: I0308 19:55:34.879583 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-api" containerID="cri-o://e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359" gracePeriod=30
Mar 08 19:55:35 crc kubenswrapper[4885]: I0308 19:55:35.896784 4885 generic.go:334] "Generic (PLEG): container finished" podID="006cf27f-ef31-4316-a5de-833141664964" containerID="442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8" exitCode=143
Mar 08 19:55:35 crc kubenswrapper[4885]: I0308 19:55:35.897041 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerDied","Data":"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8"}
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.263895 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:59308->10.217.0.199:8775: read: connection reset by peer"
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.263974 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:59300->10.217.0.199:8775: read: connection reset by peer"
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.807390 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.911575 4885 generic.go:334] "Generic (PLEG): container finished" podID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerID="bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552" exitCode=0
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.911628 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerDied","Data":"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552"}
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.911660 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerDied","Data":"72d850d89769888e1770839347c05fb2c096d5b48c6987176e4df5105daaa18d"}
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.911679 4885 scope.go:117] "RemoveContainer" containerID="bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552"
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.911700 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.939737 4885 scope.go:117] "RemoveContainer" containerID="aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898"
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.960577 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws8s4\" (UniqueName: \"kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4\") pod \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") "
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.960659 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs\") pod \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") "
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.960742 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data\") pod \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") "
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.960779 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs\") pod \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") "
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.960817 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle\") pod \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") "
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.961383 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs" (OuterVolumeSpecName: "logs") pod "a54375d5-43ad-493f-87b8-f10b9d6f68f9" (UID: "a54375d5-43ad-493f-87b8-f10b9d6f68f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.961822 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs\") on node \"crc\" DevicePath \"\""
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.967301 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4" (OuterVolumeSpecName: "kube-api-access-ws8s4") pod "a54375d5-43ad-493f-87b8-f10b9d6f68f9" (UID: "a54375d5-43ad-493f-87b8-f10b9d6f68f9"). InnerVolumeSpecName "kube-api-access-ws8s4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.977180 4885 scope.go:117] "RemoveContainer" containerID="bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552"
Mar 08 19:55:36 crc kubenswrapper[4885]: E0308 19:55:36.977830 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552\": container with ID starting with bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552 not found: ID does not exist" containerID="bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552"
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.977907 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552"} err="failed to get container status \"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552\": rpc error: code = NotFound desc = could not find container \"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552\": container with ID starting with bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552 not found: ID does not exist"
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.977973 4885 scope.go:117] "RemoveContainer" containerID="aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898"
Mar 08 19:55:36 crc kubenswrapper[4885]: E0308 19:55:36.978670 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898\": container with ID starting with aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898 not found: ID does not exist" containerID="aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898"
Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.978709 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898"} err="failed to get container status \"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898\": rpc error: code = NotFound desc = could not find container \"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898\": container with ID starting with aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898 not found: ID does not exist"
Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.006677 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data" (OuterVolumeSpecName: "config-data") pod "a54375d5-43ad-493f-87b8-f10b9d6f68f9" (UID: "a54375d5-43ad-493f-87b8-f10b9d6f68f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.016578 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a54375d5-43ad-493f-87b8-f10b9d6f68f9" (UID: "a54375d5-43ad-493f-87b8-f10b9d6f68f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.063068 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a54375d5-43ad-493f-87b8-f10b9d6f68f9" (UID: "a54375d5-43ad-493f-87b8-f10b9d6f68f9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.063506 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws8s4\" (UniqueName: \"kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4\") on node \"crc\" DevicePath \"\""
Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.063528 4885 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.063538 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.063550 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.258600 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.273876 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.285675 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 19:55:37 crc kubenswrapper[4885]: E0308 19:55:37.286153 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53478e7f-ae6d-4540-a51d-2fd03f142027" containerName="nova-manage"
Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286170 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="53478e7f-ae6d-4540-a51d-2fd03f142027" containerName="nova-manage"
Mar 08
19:55:37 crc kubenswrapper[4885]: E0308 19:55:37.286185 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286191 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" Mar 08 19:55:37 crc kubenswrapper[4885]: E0308 19:55:37.286217 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286224 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286385 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="53478e7f-ae6d-4540-a51d-2fd03f142027" containerName="nova-manage" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286395 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286418 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.287383 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.292159 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.292329 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.316558 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.377697 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" path="/var/lib/kubelet/pods/a54375d5-43ad-493f-87b8-f10b9d6f68f9/volumes" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.470000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.470045 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxxc2\" (UniqueName: \"kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.470105 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 
19:55:37.470147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.470176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.571639 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.571805 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.571839 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxxc2\" (UniqueName: \"kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.571904 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.572037 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.572434 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.575015 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.576738 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.581867 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc 
kubenswrapper[4885]: I0308 19:55:37.613211 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxxc2\" (UniqueName: \"kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.620941 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:55:38 crc kubenswrapper[4885]: W0308 19:55:38.158399 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdd926a8_442c_4f63_bb36_3e6a425436c2.slice/crio-94340869c04149bbf12f6aa9cc506894a0395e25d240510a2463c412aa09d8dc WatchSource:0}: Error finding container 94340869c04149bbf12f6aa9cc506894a0395e25d240510a2463c412aa09d8dc: Status 404 returned error can't find the container with id 94340869c04149bbf12f6aa9cc506894a0395e25d240510a2463c412aa09d8dc Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.164053 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.529806 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.696169 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle\") pod \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.696424 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lzkz\" (UniqueName: \"kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz\") pod \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.696504 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data\") pod \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.703023 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz" (OuterVolumeSpecName: "kube-api-access-9lzkz") pod "d0768dc6-7cf7-4bd7-a1de-6a68f604de14" (UID: "d0768dc6-7cf7-4bd7-a1de-6a68f604de14"). InnerVolumeSpecName "kube-api-access-9lzkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.742006 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data" (OuterVolumeSpecName: "config-data") pod "d0768dc6-7cf7-4bd7-a1de-6a68f604de14" (UID: "d0768dc6-7cf7-4bd7-a1de-6a68f604de14"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.744609 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0768dc6-7cf7-4bd7-a1de-6a68f604de14" (UID: "d0768dc6-7cf7-4bd7-a1de-6a68f604de14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.799388 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lzkz\" (UniqueName: \"kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.799454 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.799483 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.955463 4885 generic.go:334] "Generic (PLEG): container finished" podID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" exitCode=0 Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.955534 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0768dc6-7cf7-4bd7-a1de-6a68f604de14","Type":"ContainerDied","Data":"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35"} Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.955565 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.955614 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0768dc6-7cf7-4bd7-a1de-6a68f604de14","Type":"ContainerDied","Data":"0ed53f5b10de0d8d9229c24ba782e6c571620fd87d68a50336540413e9a25108"} Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.955699 4885 scope.go:117] "RemoveContainer" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.958698 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerStarted","Data":"2d8347d05b060d48223bdd22690e18395a24601df91142452e20c85edd93a56a"} Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.958750 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerStarted","Data":"8eea35d6899ebbfa973a2d7d5bbaae841e42e4044906216a557300884b93a37e"} Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.958764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerStarted","Data":"94340869c04149bbf12f6aa9cc506894a0395e25d240510a2463c412aa09d8dc"} Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.988039 4885 scope.go:117] "RemoveContainer" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" Mar 08 19:55:38 crc kubenswrapper[4885]: E0308 19:55:38.991017 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35\": container with ID starting with 9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35 not found: ID does 
not exist" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.991083 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35"} err="failed to get container status \"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35\": rpc error: code = NotFound desc = could not find container \"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35\": container with ID starting with 9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35 not found: ID does not exist" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.992583 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9925451920000001 podStartE2EDuration="1.992545192s" podCreationTimestamp="2026-03-08 19:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:38.982953065 +0000 UTC m=+1440.379007088" watchObservedRunningTime="2026-03-08 19:55:38.992545192 +0000 UTC m=+1440.388599215" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.016873 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.034625 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.063291 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:55:39 crc kubenswrapper[4885]: E0308 19:55:39.064212 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerName="nova-scheduler-scheduler" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.064237 4885 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerName="nova-scheduler-scheduler" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.064521 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerName="nova-scheduler-scheduler" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.065363 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.068755 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.077221 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.212359 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.212469 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.212767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkbq5\" (UniqueName: \"kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 
crc kubenswrapper[4885]: I0308 19:55:39.315622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.315885 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.316162 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkbq5\" (UniqueName: \"kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.323777 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.323886 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.338003 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkbq5\" (UniqueName: 
\"kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.385333 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.404407 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" path="/var/lib/kubelet/pods/d0768dc6-7cf7-4bd7-a1de-6a68f604de14/volumes" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.782384 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:39 crc kubenswrapper[4885]: W0308 19:55:39.911992 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod945717bc_405f_4628_934c_66e4500f56f0.slice/crio-4a4309458de44ee77bdd68329eb6288f0e4d16b95e6497926e889647bf594a28 WatchSource:0}: Error finding container 4a4309458de44ee77bdd68329eb6288f0e4d16b95e6497926e889647bf594a28: Status 404 returned error can't find the container with id 4a4309458de44ee77bdd68329eb6288f0e4d16b95e6497926e889647bf594a28 Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.912873 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933299 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933322 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933361 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933406 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933463 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p2mj\" (UniqueName: \"kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.934625 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs" (OuterVolumeSpecName: "logs") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.937692 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj" (OuterVolumeSpecName: "kube-api-access-5p2mj") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "kube-api-access-5p2mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.979596 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data" (OuterVolumeSpecName: "config-data") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.980518 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"945717bc-405f-4628-934c-66e4500f56f0","Type":"ContainerStarted","Data":"4a4309458de44ee77bdd68329eb6288f0e4d16b95e6497926e889647bf594a28"} Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.986262 4885 generic.go:334] "Generic (PLEG): container finished" podID="006cf27f-ef31-4316-a5de-833141664964" containerID="e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359" exitCode=0 Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.986376 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerDied","Data":"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359"} Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.986440 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.986459 4885 scope.go:117] "RemoveContainer" containerID="e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.986445 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerDied","Data":"30344ac6a0d4ba0d54022fa56e76ff23a3c0282a327f06f069842f7bf31a4249"} Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.991206 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.014260 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.019824 4885 scope.go:117] "RemoveContainer" containerID="442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.035649 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.035674 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.035683 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.035691 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.035701 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p2mj\" (UniqueName: \"kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.039453 4885 scope.go:117] "RemoveContainer" containerID="e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359" Mar 08 19:55:40 crc kubenswrapper[4885]: E0308 19:55:40.039993 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359\": container with ID starting with e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359 not found: ID does not exist" containerID="e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.040049 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359"} err="failed to get container status \"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359\": rpc error: code = NotFound desc = could not find container \"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359\": container with ID starting with e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359 not found: ID does not exist" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.040082 4885 scope.go:117] "RemoveContainer" containerID="442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8" Mar 08 19:55:40 crc kubenswrapper[4885]: E0308 19:55:40.040438 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8\": container with ID starting with 442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8 not found: ID does not exist" containerID="442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.040494 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8"} err="failed to get container status \"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8\": rpc error: code = NotFound desc = could not find container \"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8\": container with ID 
starting with 442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8 not found: ID does not exist" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.045473 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.138303 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.346844 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.372415 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.385694 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:40 crc kubenswrapper[4885]: E0308 19:55:40.386221 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-log" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.386243 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-log" Mar 08 19:55:40 crc kubenswrapper[4885]: E0308 19:55:40.386275 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-api" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.386283 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-api" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.386493 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-api" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.386519 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-log" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.387741 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.389151 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.390950 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.391180 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.396436 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549256 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549319 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w82z5\" (UniqueName: \"kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5\") pod \"nova-api-0\" (UID: 
\"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549358 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549374 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549400 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549494 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651200 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651241 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651276 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651344 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651391 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651434 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w82z5\" (UniqueName: \"kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.652954 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " 
pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.656071 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.658419 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.658506 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.669557 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.671320 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w82z5\" (UniqueName: \"kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.777709 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:41 crc kubenswrapper[4885]: I0308 19:55:41.004049 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"945717bc-405f-4628-934c-66e4500f56f0","Type":"ContainerStarted","Data":"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0"} Mar 08 19:55:41 crc kubenswrapper[4885]: I0308 19:55:41.021986 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.021969945 podStartE2EDuration="2.021969945s" podCreationTimestamp="2026-03-08 19:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:41.017547427 +0000 UTC m=+1442.413601470" watchObservedRunningTime="2026-03-08 19:55:41.021969945 +0000 UTC m=+1442.418023968" Mar 08 19:55:41 crc kubenswrapper[4885]: I0308 19:55:41.260444 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:41 crc kubenswrapper[4885]: W0308 19:55:41.264881 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda083a431_5afc_4289_a5cf_625bc619465e.slice/crio-5cda51b7f06b2df67ff66ec569970b5a1ee1ed8790ed67119aa132ee93bae076 WatchSource:0}: Error finding container 5cda51b7f06b2df67ff66ec569970b5a1ee1ed8790ed67119aa132ee93bae076: Status 404 returned error can't find the container with id 5cda51b7f06b2df67ff66ec569970b5a1ee1ed8790ed67119aa132ee93bae076 Mar 08 19:55:41 crc kubenswrapper[4885]: I0308 19:55:41.385909 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="006cf27f-ef31-4316-a5de-833141664964" path="/var/lib/kubelet/pods/006cf27f-ef31-4316-a5de-833141664964/volumes" Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.015363 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerStarted","Data":"4621296620fb47f88fa5eba2c20d8a6e4bdfb8042020e5c5c63fda5c073094aa"} Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.015740 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerStarted","Data":"701b0f4ea9f8a0e00b422b3381b168dbb04b81fe4ea92e28f34d36f84301009f"} Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.015749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerStarted","Data":"5cda51b7f06b2df67ff66ec569970b5a1ee1ed8790ed67119aa132ee93bae076"} Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.044219 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.044198473 podStartE2EDuration="2.044198473s" podCreationTimestamp="2026-03-08 19:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:42.038188252 +0000 UTC m=+1443.434242295" watchObservedRunningTime="2026-03-08 19:55:42.044198473 +0000 UTC m=+1443.440252506" Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.621452 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.621559 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 19:55:44 crc kubenswrapper[4885]: I0308 19:55:44.385513 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 19:55:47 crc kubenswrapper[4885]: I0308 19:55:47.621842 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 19:55:47 crc 
kubenswrapper[4885]: I0308 19:55:47.623189 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 19:55:48 crc kubenswrapper[4885]: I0308 19:55:48.641220 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:48 crc kubenswrapper[4885]: I0308 19:55:48.641257 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:49 crc kubenswrapper[4885]: I0308 19:55:49.387738 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 19:55:49 crc kubenswrapper[4885]: I0308 19:55:49.434339 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 19:55:50 crc kubenswrapper[4885]: I0308 19:55:50.165844 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 19:55:50 crc kubenswrapper[4885]: I0308 19:55:50.778485 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:55:50 crc kubenswrapper[4885]: I0308 19:55:50.778526 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:55:51 crc kubenswrapper[4885]: I0308 19:55:51.790175 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-api" probeResult="failure" output="Get 
\"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:51 crc kubenswrapper[4885]: I0308 19:55:51.790216 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:57 crc kubenswrapper[4885]: I0308 19:55:57.223138 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 08 19:55:57 crc kubenswrapper[4885]: I0308 19:55:57.636286 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 19:55:57 crc kubenswrapper[4885]: I0308 19:55:57.636756 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 19:55:57 crc kubenswrapper[4885]: I0308 19:55:57.645410 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 19:55:57 crc kubenswrapper[4885]: I0308 19:55:57.647555 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.163329 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549996-rr28z"] Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.171775 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.175573 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.176079 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.179948 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.193307 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549996-rr28z"] Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.299087 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnkc6\" (UniqueName: \"kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6\") pod \"auto-csr-approver-29549996-rr28z\" (UID: \"5d9eec97-1b10-4d6a-9787-e137d3c37dec\") " pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.401447 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnkc6\" (UniqueName: \"kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6\") pod \"auto-csr-approver-29549996-rr28z\" (UID: \"5d9eec97-1b10-4d6a-9787-e137d3c37dec\") " pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.425548 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnkc6\" (UniqueName: \"kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6\") pod \"auto-csr-approver-29549996-rr28z\" (UID: \"5d9eec97-1b10-4d6a-9787-e137d3c37dec\") " 
pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.511550 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.812022 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.813046 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.813093 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.825332 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.966238 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549996-rr28z"] Mar 08 19:56:00 crc kubenswrapper[4885]: W0308 19:56:00.967078 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d9eec97_1b10_4d6a_9787_e137d3c37dec.slice/crio-45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f WatchSource:0}: Error finding container 45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f: Status 404 returned error can't find the container with id 45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f Mar 08 19:56:01 crc kubenswrapper[4885]: I0308 19:56:01.240983 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549996-rr28z" event={"ID":"5d9eec97-1b10-4d6a-9787-e137d3c37dec","Type":"ContainerStarted","Data":"45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f"} Mar 08 19:56:01 crc kubenswrapper[4885]: I0308 
19:56:01.241592 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 19:56:01 crc kubenswrapper[4885]: I0308 19:56:01.254340 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 19:56:03 crc kubenswrapper[4885]: I0308 19:56:03.269642 4885 generic.go:334] "Generic (PLEG): container finished" podID="5d9eec97-1b10-4d6a-9787-e137d3c37dec" containerID="5576d8a075a0f44218f0cea569a1265805ed7bccbb93726d2adc83621dd67e49" exitCode=0 Mar 08 19:56:03 crc kubenswrapper[4885]: I0308 19:56:03.269724 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549996-rr28z" event={"ID":"5d9eec97-1b10-4d6a-9787-e137d3c37dec","Type":"ContainerDied","Data":"5576d8a075a0f44218f0cea569a1265805ed7bccbb93726d2adc83621dd67e49"} Mar 08 19:56:04 crc kubenswrapper[4885]: I0308 19:56:04.623512 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:04 crc kubenswrapper[4885]: I0308 19:56:04.786406 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnkc6\" (UniqueName: \"kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6\") pod \"5d9eec97-1b10-4d6a-9787-e137d3c37dec\" (UID: \"5d9eec97-1b10-4d6a-9787-e137d3c37dec\") " Mar 08 19:56:04 crc kubenswrapper[4885]: I0308 19:56:04.795086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6" (OuterVolumeSpecName: "kube-api-access-bnkc6") pod "5d9eec97-1b10-4d6a-9787-e137d3c37dec" (UID: "5d9eec97-1b10-4d6a-9787-e137d3c37dec"). InnerVolumeSpecName "kube-api-access-bnkc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:04 crc kubenswrapper[4885]: I0308 19:56:04.889816 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnkc6\" (UniqueName: \"kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:05 crc kubenswrapper[4885]: I0308 19:56:05.293894 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549996-rr28z" event={"ID":"5d9eec97-1b10-4d6a-9787-e137d3c37dec","Type":"ContainerDied","Data":"45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f"} Mar 08 19:56:05 crc kubenswrapper[4885]: I0308 19:56:05.293967 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f" Mar 08 19:56:05 crc kubenswrapper[4885]: I0308 19:56:05.294004 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:05 crc kubenswrapper[4885]: I0308 19:56:05.712523 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549990-8n4qs"] Mar 08 19:56:05 crc kubenswrapper[4885]: I0308 19:56:05.729600 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549990-8n4qs"] Mar 08 19:56:07 crc kubenswrapper[4885]: I0308 19:56:07.388363 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8bff80c-e537-4de5-8a05-85ee81004c30" path="/var/lib/kubelet/pods/d8bff80c-e537-4de5-8a05-85ee81004c30/volumes" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.084781 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.085414 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" 
podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" containerName="openstackclient" containerID="cri-o://6670e6817995526cd80a6c1b2064f3af999a3d367e59a87b40d4c34b2c61c6e3" gracePeriod=2 Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.105420 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.173077 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:23 crc kubenswrapper[4885]: E0308 19:56:23.173428 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9eec97-1b10-4d6a-9787-e137d3c37dec" containerName="oc" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.173445 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9eec97-1b10-4d6a-9787-e137d3c37dec" containerName="oc" Mar 08 19:56:23 crc kubenswrapper[4885]: E0308 19:56:23.173485 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" containerName="openstackclient" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.173491 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" containerName="openstackclient" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.173649 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9eec97-1b10-4d6a-9787-e137d3c37dec" containerName="oc" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.173672 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" containerName="openstackclient" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.174440 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.186839 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.215741 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.248418 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-4khmp"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.261050 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2b5kk"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.266866 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-4khmp"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.273433 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2b5kk"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.293470 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.293614 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zhx2\" (UniqueName: \"kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " 
pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.301785 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bq8dk"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.303045 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.315968 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.317467 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.321458 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.321600 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.333398 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-qx2kg"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.350997 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-qx2kg"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.395566 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.395666 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zhx2\" (UniqueName: \"kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.396619 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.405443 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e75774-c86c-459a-9c66-eaf3c43addac" path="/var/lib/kubelet/pods/11e75774-c86c-459a-9c66-eaf3c43addac/volumes" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.419097 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b6d8115-d92a-4305-a2d2-8d9874a81390" path="/var/lib/kubelet/pods/3b6d8115-d92a-4305-a2d2-8d9874a81390/volumes" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.419892 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" path="/var/lib/kubelet/pods/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41/volumes" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.420448 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.420473 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bq8dk"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.434387 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-86ea-account-create-update-tdtcq"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.452238 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-86ea-account-create-update-tdtcq"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.473494 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zhx2\" (UniqueName: \"kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.488211 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.506270 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.520165 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd8sz\" (UniqueName: \"kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.520262 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.520414 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hl27\" (UniqueName: \"kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.520455 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.641029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd8sz\" (UniqueName: \"kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.641086 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.641151 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hl27\" (UniqueName: \"kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: 
\"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.641175 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: E0308 19:56:23.641301 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 08 19:56:23 crc kubenswrapper[4885]: E0308 19:56:23.641347 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data podName:01dc1fd5-4e2f-4129-9452-ed50fa1d182b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:24.141331407 +0000 UTC m=+1485.537385430 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data") pod "rabbitmq-server-0" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b") : configmap "rabbitmq-config-data" not found Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.642267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.642844 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.681376 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.690151 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.697438 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.710365 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hl27\" (UniqueName: \"kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.718460 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3284-account-create-update-qht6h"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.732196 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.783545 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3284-account-create-update-qht6h"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.786670 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd8sz\" (UniqueName: \"kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.808484 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.809963 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.827721 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.833584 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.852561 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.852619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqv7f\" (UniqueName: \"kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.852778 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.854158 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.858412 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.886844 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.930278 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.936054 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.941868 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958312 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958402 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts\") pod \"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958467 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958487 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfcbl\" (UniqueName: \"kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl\") pod \"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958507 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqv7f\" (UniqueName: \"kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958554 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkg9d\" (UniqueName: \"kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.964148 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " 
pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.004736 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqv7f\" (UniqueName: \"kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.004800 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.005004 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-5qsh8" podUID="474d55a2-f4f0-4e46-809c-367a3110c33d" containerName="openstack-network-exporter" containerID="cri-o://e8623499ee05972629def49d746cd17bbc01095c41d8c7e431c723d1dc4187ee" gracePeriod=30 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.027949 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mn4lz"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.062945 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.063211 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts\") pod \"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " 
pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.063273 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfcbl\" (UniqueName: \"kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl\") pod \"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.063310 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkg9d\" (UniqueName: \"kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.067640 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts\") pod \"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.068172 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.098564 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f33b-account-create-update-hcg7j"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.119834 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-dkg9d\" (UniqueName: \"kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.121494 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfcbl\" (UniqueName: \"kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl\") pod \"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.129363 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f33b-account-create-update-hcg7j"] Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.165027 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.165092 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data podName:01dc1fd5-4e2f-4129-9452-ed50fa1d182b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:25.165078034 +0000 UTC m=+1486.561132057 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data") pod "rabbitmq-server-0" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b") : configmap "rabbitmq-config-data" not found Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.181955 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.186671 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.188170 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.193566 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.206060 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.226076 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-031a-account-create-update-qvdpr"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.232813 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.254891 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.255420 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="ovn-northd" containerID="cri-o://e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f" gracePeriod=30 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.255671 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="openstack-network-exporter" containerID="cri-o://619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0" gracePeriod=30 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.272775 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.272899 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-031a-account-create-update-qvdpr"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.287572 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.307426 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zz9c5"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.317849 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zz9c5"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.326110 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-mn5x8"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.348033 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-mn5x8"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.363790 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.364026 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="dnsmasq-dns" containerID="cri-o://8f4e92cbc965cfb3ab7ad14008b7ceea724e345981ffb88a02993256da9e6dcb" gracePeriod=10 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.371225 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhdsh\" (UniqueName: \"kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " 
pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.371292 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.388088 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a047-account-create-update-64r5q"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.429141 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a047-account-create-update-64r5q"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.447050 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmgdm"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.489194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhdsh\" (UniqueName: \"kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.489366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.490492 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: 
configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.490538 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data podName:96257eac-42ec-44cf-80be-9be68c0ebb1b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:24.990523914 +0000 UTC m=+1486.386577937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b") : configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.490707 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.524822 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmgdm"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.560513 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhdsh\" (UniqueName: \"kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.690523 4885 generic.go:334] "Generic (PLEG): container finished" podID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerID="8f4e92cbc965cfb3ab7ad14008b7ceea724e345981ffb88a02993256da9e6dcb" exitCode=0 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 
19:56:24.690885 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" event={"ID":"8185583f-0ca5-46b1-a1ed-77c35b13a07b","Type":"ContainerDied","Data":"8f4e92cbc965cfb3ab7ad14008b7ceea724e345981ffb88a02993256da9e6dcb"} Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.700178 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-l6hg9"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.794130 4885 generic.go:334] "Generic (PLEG): container finished" podID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerID="619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0" exitCode=2 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.794168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerDied","Data":"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0"} Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.826875 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.855806 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-l6hg9"] Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.865516 4885 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-mn4lz" message=< Mar 08 19:56:24 crc kubenswrapper[4885]: Exiting ovn-controller (1) [ OK ] Mar 08 19:56:24 crc kubenswrapper[4885]: > Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.865556 4885 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-mn4lz" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller" containerID="cri-o://40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.865588 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-mn4lz" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller" containerID="cri-o://40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605" gracePeriod=30 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.916189 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-fwjsc"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.997548 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-fwjsc"] Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.038974 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: container 
&Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: if [ -n "nova_cell1" ]; then Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="nova_cell1" Mar 08 19:56:25 crc kubenswrapper[4885]: else Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:25 crc kubenswrapper[4885]: fi Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:25 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:25 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:25 crc kubenswrapper[4885]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:25 crc kubenswrapper[4885]: # support updates Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.040364 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" podUID="a3e6330b-4e2d-44ca-b9be-d36b2f613571" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.047748 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.047806 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data podName:96257eac-42ec-44cf-80be-9be68c0ebb1b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:26.047791953 +0000 UTC m=+1487.443845976 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b") : configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.054027 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mp8c9"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.123506 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-mp8c9"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.191253 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kpvl2"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.215066 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kpvl2"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.231821 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-pq8mq"] Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.250627 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.250694 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data podName:01dc1fd5-4e2f-4129-9452-ed50fa1d182b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:27.250678631 +0000 UTC m=+1488.646732654 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data") pod "rabbitmq-server-0" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b") : configmap "rabbitmq-config-data" not found Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.259963 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-pq8mq"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.270790 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271240 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-server" containerID="cri-o://803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271337 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-reaper" containerID="cri-o://3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271307 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-server" containerID="cri-o://2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271359 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-server" containerID="cri-o://a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc" gracePeriod=30 Mar 08 19:56:25 crc 
kubenswrapper[4885]: I0308 19:56:25.271397 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-replicator" containerID="cri-o://5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271492 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-auditor" containerID="cri-o://6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271530 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-updater" containerID="cri-o://e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271553 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-replicator" containerID="cri-o://25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271388 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-auditor" containerID="cri-o://904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271618 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-expirer" 
containerID="cri-o://7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271650 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="swift-recon-cron" containerID="cri-o://9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271652 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-auditor" containerID="cri-o://37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271677 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="rsync" containerID="cri-o://64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271705 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-updater" containerID="cri-o://6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271777 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-replicator" containerID="cri-o://d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.322167 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 
08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.322815 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="openstack-network-exporter" containerID="cri-o://e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5" gracePeriod=300 Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.374489 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: if [ -n "nova_cell0" ]; then Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="nova_cell0" Mar 08 19:56:25 crc kubenswrapper[4885]: else Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:25 crc kubenswrapper[4885]: fi Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:25 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:25 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:25 crc kubenswrapper[4885]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:25 crc kubenswrapper[4885]: # support updates Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.375727 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" podUID="2242ad5f-8a7e-4017-8441-6d05b2c94930" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.435588 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f704685-800d-4386-a47d-8c60b0885aca" path="/var/lib/kubelet/pods/0f704685-800d-4386-a47d-8c60b0885aca/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.436326 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" path="/var/lib/kubelet/pods/2efc22fd-a92b-422c-876d-7b80f06928b2/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.450838 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dcb0d23-1927-4f70-ac45-bcc01f9a081a" path="/var/lib/kubelet/pods/3dcb0d23-1927-4f70-ac45-bcc01f9a081a/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.451378 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" path="/var/lib/kubelet/pods/46290bd2-6ad7-46f4-86f4-48aa73bc304a/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.451869 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53478e7f-ae6d-4540-a51d-2fd03f142027" path="/var/lib/kubelet/pods/53478e7f-ae6d-4540-a51d-2fd03f142027/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.456602 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="57032abe-6c4f-4711-9f48-5733d6a29ec3" path="/var/lib/kubelet/pods/57032abe-6c4f-4711-9f48-5733d6a29ec3/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.457141 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.457277 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="618b5189-8b29-473f-b59c-e911fca71041" path="/var/lib/kubelet/pods/618b5189-8b29-473f-b59c-e911fca71041/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.458139 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fff4a7a-1b14-4e29-8c84-d7fc55de879c" path="/var/lib/kubelet/pods/6fff4a7a-1b14-4e29-8c84-d7fc55de879c/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.459314 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72fa5124-24e9-47b1-8522-815cfef2a86b" path="/var/lib/kubelet/pods/72fa5124-24e9-47b1-8522-815cfef2a86b/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.460063 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85d25daf-f279-4be1-be4a-75e05e47923c" path="/var/lib/kubelet/pods/85d25daf-f279-4be1-be4a-75e05e47923c/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.460747 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="915cd482-d3dc-42c1-96cc-0fcc18bbaff2" path="/var/lib/kubelet/pods/915cd482-d3dc-42c1-96cc-0fcc18bbaff2/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.465245 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b04611e7-17b5-48ae-8169-534f684a101b" path="/var/lib/kubelet/pods/b04611e7-17b5-48ae-8169-534f684a101b/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.479278 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4353f36-d8f9-41ff-8062-f874bd53ef12" 
path="/var/lib/kubelet/pods/e4353f36-d8f9-41ff-8062-f874bd53ef12/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.479867 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.540508 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.540761 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-58c657b6d6-r4tf7" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-log" containerID="cri-o://2ea3e6e51d477fb7795967def43fe0be522063fbf11d9053ce41fa22a8bf42b3" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.541138 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-58c657b6d6-r4tf7" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-api" containerID="cri-o://1b8b8e4856a24e16b23d4c15ef261857dfcc94531017a1b02728028102e1d5ce" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.545162 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-25qrp"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.567663 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-25qrp"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.576322 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.579239 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="openstack-network-exporter" containerID="cri-o://f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8" gracePeriod=300 Mar 08 19:56:25 crc 
kubenswrapper[4885]: I0308 19:56:25.582859 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.582907 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.582941 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j727f\" (UniqueName: \"kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.583015 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.583037 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.583065 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.611523 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f" (OuterVolumeSpecName: "kube-api-access-j727f") pod "8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "kube-api-access-j727f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.622746 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.628750 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-log" containerID="cri-o://701b0f4ea9f8a0e00b422b3381b168dbb04b81fe4ea92e28f34d36f84301009f" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.630315 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-api" containerID="cri-o://4621296620fb47f88fa5eba2c20d8a6e4bdfb8042020e5c5c63fda5c073094aa" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.633226 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.709317 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j727f\" (UniqueName: \"kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.731862 4885 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5fh2h"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.735838 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="ovsdbserver-nb" containerID="cri-o://4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c" gracePeriod=300 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.770200 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="ovsdbserver-sb" containerID="cri-o://1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" gracePeriod=300 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.796098 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-5fh2h"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.803335 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.816997 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hjcwx"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.819448 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5qsh8_474d55a2-f4f0-4e46-809c-367a3110c33d/openstack-network-exporter/0.log" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.819490 4885 generic.go:334] "Generic (PLEG): container finished" podID="474d55a2-f4f0-4e46-809c-367a3110c33d" containerID="e8623499ee05972629def49d746cd17bbc01095c41d8c7e431c723d1dc4187ee" exitCode=2 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.819537 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5qsh8" 
event={"ID":"474d55a2-f4f0-4e46-809c-367a3110c33d","Type":"ContainerDied","Data":"e8623499ee05972629def49d746cd17bbc01095c41d8c7e431c723d1dc4187ee"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.823586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" event={"ID":"a3e6330b-4e2d-44ca-b9be-d36b2f613571","Type":"ContainerStarted","Data":"17603ca27d7ecde56e3c1a978d3f1cf7aefc5c4500d55699231533afe9deacd5"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.834665 4885 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" secret="" err="secret \"galera-openstack-cell1-dockercfg-5kqpk\" not found" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.837242 4885 generic.go:334] "Generic (PLEG): container finished" podID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerID="2ea3e6e51d477fb7795967def43fe0be522063fbf11d9053ce41fa22a8bf42b3" exitCode=143 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.837302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerDied","Data":"2ea3e6e51d477fb7795967def43fe0be522063fbf11d9053ce41fa22a8bf42b3"} Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.841882 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: export 
DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: if [ -n "nova_cell1" ]; then Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="nova_cell1" Mar 08 19:56:25 crc kubenswrapper[4885]: else Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:25 crc kubenswrapper[4885]: fi Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:25 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:25 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:25 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:25 crc kubenswrapper[4885]: # support updates Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.842785 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5qsh8_474d55a2-f4f0-4e46-809c-367a3110c33d/openstack-network-exporter/0.log" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.842831 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.842970 4885 generic.go:334] "Generic (PLEG): container finished" podID="a083a431-5afc-4289-a5cf-625bc619465e" containerID="701b0f4ea9f8a0e00b422b3381b168dbb04b81fe4ea92e28f34d36f84301009f" exitCode=143 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.843033 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerDied","Data":"701b0f4ea9f8a0e00b422b3381b168dbb04b81fe4ea92e28f34d36f84301009f"} Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.843072 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" podUID="a3e6330b-4e2d-44ca-b9be-d36b2f613571" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.854258 4885 generic.go:334] "Generic (PLEG): container finished" podID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" containerID="6670e6817995526cd80a6c1b2064f3af999a3d367e59a87b40d4c34b2c61c6e3" exitCode=137 Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.869908 4885 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 08 19:56:25 crc kubenswrapper[4885]: + source /usr/local/bin/container-scripts/functions Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNBridge=br-int Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNRemote=tcp:localhost:6642 Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNEncapType=geneve Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNAvailabilityZones= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ 
EnableChassisAsGateway=true Mar 08 19:56:25 crc kubenswrapper[4885]: ++ PhysicalNetworks= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNHostName= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 08 19:56:25 crc kubenswrapper[4885]: ++ ovs_dir=/var/lib/openvswitch Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 08 19:56:25 crc kubenswrapper[4885]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + cleanup_ovsdb_server_semaphore Mar 08 19:56:25 crc kubenswrapper[4885]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 08 19:56:25 crc kubenswrapper[4885]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-pp4rs" message=< Mar 08 19:56:25 crc kubenswrapper[4885]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 08 19:56:25 crc kubenswrapper[4885]: + source /usr/local/bin/container-scripts/functions Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNBridge=br-int Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNRemote=tcp:localhost:6642 Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNEncapType=geneve Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNAvailabilityZones= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ EnableChassisAsGateway=true Mar 08 19:56:25 crc kubenswrapper[4885]: ++ PhysicalNetworks= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNHostName= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 08 19:56:25 crc kubenswrapper[4885]: ++ ovs_dir=/var/lib/openvswitch Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 08 19:56:25 crc kubenswrapper[4885]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + cleanup_ovsdb_server_semaphore Mar 08 19:56:25 crc kubenswrapper[4885]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 08 19:56:25 crc kubenswrapper[4885]: > Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.871192 4885 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 08 19:56:25 crc kubenswrapper[4885]: + source /usr/local/bin/container-scripts/functions Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNBridge=br-int Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNRemote=tcp:localhost:6642 Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNEncapType=geneve Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNAvailabilityZones= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ EnableChassisAsGateway=true Mar 08 19:56:25 crc kubenswrapper[4885]: ++ PhysicalNetworks= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNHostName= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 08 19:56:25 crc kubenswrapper[4885]: ++ ovs_dir=/var/lib/openvswitch Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 08 19:56:25 crc kubenswrapper[4885]: ++ 
SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + cleanup_ovsdb_server_semaphore Mar 08 19:56:25 crc kubenswrapper[4885]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 08 19:56:25 crc kubenswrapper[4885]: > pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" containerID="cri-o://a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.871822 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" containerID="cri-o://a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" gracePeriod=29 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.871233 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hjcwx"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.871404 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod 
"8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.872477 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.872672 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.873344 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" event={"ID":"8185583f-0ca5-46b1-a1ed-77c35b13a07b","Type":"ContainerDied","Data":"faaf9045f9d157b601bfa8ca719045997cf05f3ca07f75878e38d95227719aa4"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.873413 4885 scope.go:117] "RemoveContainer" containerID="8f4e92cbc965cfb3ab7ad14008b7ceea724e345981ffb88a02993256da9e6dcb" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.884887 4885 generic.go:334] "Generic (PLEG): container finished" podID="b8dd6448-dd16-4487-bc90-f835712effc1" containerID="f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8" exitCode=2 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.884969 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerDied","Data":"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.889069 4885 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" containerID="cri-o://e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" gracePeriod=29 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.890098 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.895147 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 is running failed: container process not found" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.900305 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.900567 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="cinder-scheduler" containerID="cri-o://fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.900692 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="probe" containerID="cri-o://ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 
19:56:25.902218 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 is running failed: container process not found" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.903408 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 is running failed: container process not found" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.903437 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="ovsdbserver-sb" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.904610 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d768ed9e-b089-4308-befc-e3bd6aa68683/ovsdbserver-nb/0.log" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.904642 4885 generic.go:334] "Generic (PLEG): container finished" podID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerID="e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5" exitCode=2 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.904699 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerDied","Data":"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.910097 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" event={"ID":"2242ad5f-8a7e-4017-8441-6d05b2c94930","Type":"ContainerStarted","Data":"82162e312894c7024d5a8960ccd0eb1055e87a8eb4f79ab3a9a1cc489e5a6576"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.918566 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.919191 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config" (OuterVolumeSpecName: "config") pod "8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.922605 4885 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.922666 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts podName:a3e6330b-4e2d-44ca-b9be-d36b2f613571 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:26.422651823 +0000 UTC m=+1487.818705846 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts") pod "nova-cell1-cd3f-account-create-update-ccxvz" (UID: "a3e6330b-4e2d-44ca-b9be-d36b2f613571") : configmap "openstack-cell1-scripts" not found Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.922931 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.923347 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.923360 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.923370 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.923384 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.936037 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.942534 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mn4lz" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.943461 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.950413 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: if [ -n "nova_cell0" ]; then Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="nova_cell0" Mar 08 19:56:25 crc kubenswrapper[4885]: else Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:25 crc kubenswrapper[4885]: fi Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:25 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:25 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:25 crc kubenswrapper[4885]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:25 crc kubenswrapper[4885]: # support updates Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.953154 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" podUID="2242ad5f-8a7e-4017-8441-6d05b2c94930" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.973946 4885 scope.go:117] "RemoveContainer" containerID="4a9e799e765066afa260a1cecf9172d38f7e49cda7c9f4bc8c9ce49bcef121a4" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.975826 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.990582 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-744484b5fc-g6mjz" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-httpd" containerID="cri-o://67f1959dc61ea688f5069874c4fb20e1a6cd0f9f33725553c532e43624ce5124" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.990846 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-744484b5fc-g6mjz" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-server" containerID="cri-o://52b3a63b99137a90e03814ec54cafbe70d9ceac7f36dc9e81e362bf716f0276b" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.007971 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.008252 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-api-0" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api-log" containerID="cri-o://48e1f046f7d97f16af118173fbff33a7753d9f3ef98b111d3153850bfbfdaf65" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.008414 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api" containerID="cri-o://7305f11ea3e6d044101cca24b88547af01f2e1506724f8e566c2a9df42c34dc6" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.014285 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.014558 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" containerID="cri-o://8eea35d6899ebbfa973a2d7d5bbaae841e42e4044906216a557300884b93a37e" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.015102 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" containerID="cri-o://2d8347d05b060d48223bdd22690e18395a24601df91142452e20c85edd93a56a" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.024035 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.024224 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bb5b9c587-nd8hp" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-api" containerID="cri-o://17e37a1234fac68b042cb982b6be421ba7a3bd54c84d93b8bbb1842a9f1fa332" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025182 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025229 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9cs\" (UniqueName: \"kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025275 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025305 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025320 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir\") pod \"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025395 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4tlt\" (UniqueName: \"kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt\") pod 
\"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025416 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs\") pod \"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025489 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle\") pod \"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025515 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config\") pod \"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025545 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir\") pod \"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025575 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025602 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025619 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.026284 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bb5b9c587-nd8hp" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-httpd" containerID="cri-o://46919954f7a8695f89f60ffd2c95fd19f9f50cf97e2bbb06931bbceff7c47a47" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.026489 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run" (OuterVolumeSpecName: "var-run") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.026801 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.026830 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.026851 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.027824 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config" (OuterVolumeSpecName: "config") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.028514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.046810 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt" (OuterVolumeSpecName: "kube-api-access-l4tlt") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "kube-api-access-l4tlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060090 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060123 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060132 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060138 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060145 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060152 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" 
containerID="6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060159 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060165 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060171 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060177 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060242 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060267 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060277 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060286 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060295 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060318 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060327 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 
19:56:26.060335 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82"} Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.061150 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:26 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: if [ -n "" ]; then Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="" Mar 08 19:56:26 crc kubenswrapper[4885]: else Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:26 crc kubenswrapper[4885]: fi Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:26 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:26 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:26 crc kubenswrapper[4885]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:26 crc kubenswrapper[4885]: # support updates Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.062470 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-bq8dk" podUID="09db13b9-d564-49c9-b383-5fbfe0e43c9b" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.064615 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts" (OuterVolumeSpecName: "scripts") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.065730 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.066391 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs" (OuterVolumeSpecName: "kube-api-access-lz9cs") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "kube-api-access-lz9cs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.067521 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="galera" containerID="cri-o://db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.071231 4885 generic.go:334] "Generic (PLEG): container finished" podID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerID="40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.071282 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz" event={"ID":"1c223ffe-b12c-4c78-920a-66e6feb9178f","Type":"ContainerDied","Data":"40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.071312 4885 scope.go:117] "RemoveContainer" containerID="40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.071460 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mn4lz" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.073201 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-b5mql"] Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.080434 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:26 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: if [ -n "barbican" ]; then Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="barbican" Mar 08 19:56:26 crc kubenswrapper[4885]: else Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:26 crc kubenswrapper[4885]: fi Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:26 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:26 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:26 crc kubenswrapper[4885]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:26 crc kubenswrapper[4885]: # support updates Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.080694 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-b5mql"] Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.083694 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-86ea-account-create-update-zb2lr" podUID="619a568c-d0c3-408b-96c1-39a3a769d1ad" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.085798 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.086043 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-log" containerID="cri-o://4e4c7d9e4404bd1a5433f1787f2f7abf1d5d2e0fd51aebb6079e0aa7c48cd16e" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.086270 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-httpd" containerID="cri-o://41d8583c3d498141cc2e38d1ed8623082609a86faa3f087e124f2692ac0c8871" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.091039 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-fdab-account-create-update-vs9sz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.098070 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-fdab-account-create-update-vs9sz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.109343 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8wdm8"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.112046 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.130451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret\") pod \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.130730 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phmx9\" (UniqueName: \"kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9\") pod \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.130775 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle\") pod \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.130800 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config\") pod \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131207 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4tlt\" (UniqueName: \"kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131220 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131229 4885 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131239 4885 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131247 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131254 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131264 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9cs\" (UniqueName: 
\"kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131272 4885 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131280 4885 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131289 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.131340 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.131381 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data podName:96257eac-42ec-44cf-80be-9be68c0ebb1b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:28.131367856 +0000 UTC m=+1489.527421879 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b") : configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.133599 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9" (OuterVolumeSpecName: "kube-api-access-phmx9") pod "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" (UID: "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84"). InnerVolumeSpecName "kube-api-access-phmx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.135912 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.147700 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8wdm8"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.162161 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.162389 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-log" containerID="cri-o://2d562399fb223e806d5f3ddff2425b5e427d18a16330c90ac41a561625d41719" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.162781 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-httpd" containerID="cri-o://f5e5f2790360729d9c4394c0e85bb4e8ea8164ab35be9023623e79b3a117f852" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.164560 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" (UID: "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.198080 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" (UID: "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.213022 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" (UID: "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.214411 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.220223 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.241903 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.242125 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.242135 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phmx9\" (UniqueName: \"kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9\") on node \"crc\" DevicePath \"\"" Mar 08 
19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.242144 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.242254 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.242264 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.246052 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.259695 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.277628 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.286361 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-wphzx"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.295010 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.311869 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="rabbitmq" containerID="cri-o://ab00747eae0e5726409cc3faafb18065815833a98680950d3e1962529cb0f73d" gracePeriod=604800 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.315508 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-wphzx"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.324344 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9jddz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.338217 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d768ed9e-b089-4308-befc-e3bd6aa68683/ovsdbserver-nb/0.log" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.338277 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.343806 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.349370 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9jddz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.355450 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-td7dc"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.362672 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-td7dc"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.370469 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jc2k"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.376943 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.390240 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.390453 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="9bbdf164-51e7-4faf-986b-fba5044fad2b" containerName="nova-cell1-conductor-conductor" containerID="cri-o://bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.397590 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jc2k"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.436680 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.436983 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://32768cc82a61a862691a2497facf7d05127e1de2145b441cde8a059f45152b82" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445609 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445732 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445751 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445861 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chcvg\" (UniqueName: \"kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445884 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445945 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445965 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.446047 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.446465 4885 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.446521 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts podName:a3e6330b-4e2d-44ca-b9be-d36b2f613571 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:27.446508432 +0000 UTC m=+1488.842562455 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts") pod "nova-cell1-cd3f-account-create-update-ccxvz" (UID: "a3e6330b-4e2d-44ca-b9be-d36b2f613571") : configmap "openstack-cell1-scripts" not found Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.447416 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.447900 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config" (OuterVolumeSpecName: "config") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.449472 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts" (OuterVolumeSpecName: "scripts") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.449710 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.449984 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener-log" containerID="cri-o://3a20cf21bbfb4da8c71131e4075d64b83bae96d5c5020bc3cfadcf8d7226f8bc" gracePeriod=30
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.450731 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener" containerID="cri-o://6e7be97046549290741b9a7850306bb8d9be298e24617283ccb5d04dda12497f" gracePeriod=30
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.474786 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.481580 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.482512 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker-log" containerID="cri-o://d5cd5c3527dc17515d5a33bed3c5118e0fcbd6d15187bcfb409883f29afc80a6" gracePeriod=30
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.482846 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker" containerID="cri-o://b8d53aa1399bba98dc12433735d0a8b3cb69b3036f3c8fb648dbc900fdb658b2" gracePeriod=30
Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.498122 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.499062 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b8dd6448-dd16-4487-bc90-f835712effc1/ovsdbserver-sb/0.log"
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.499152 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.504658 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg" (OuterVolumeSpecName: "kube-api-access-chcvg") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "kube-api-access-chcvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.505170 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.505292 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.509087 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.509154 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="galera"
Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.544811 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 08 19:56:26 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: if [ -n "neutron" ]; then
Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="neutron"
Mar 08 19:56:26 crc kubenswrapper[4885]: else
Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="*"
Mar 08 19:56:26 crc kubenswrapper[4885]: fi
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: # going for maximum compatibility here:
Mar 08 19:56:26 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 08 19:56:26 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 08 19:56:26 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 08 19:56:26 crc kubenswrapper[4885]: # support updates
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError"
Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.546604 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 08 19:56:26 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: if [ -n "cinder" ]; then
Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="cinder"
Mar 08 19:56:26 crc kubenswrapper[4885]: else
Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="*"
Mar 08 19:56:26 crc kubenswrapper[4885]: fi
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: # going for maximum compatibility here:
Mar 08 19:56:26 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 08 19:56:26 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 08 19:56:26 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 08 19:56:26 crc kubenswrapper[4885]: # support updates
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError"
Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.547077 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-031a-account-create-update-ss6cl" podUID="36678078-1658-4edc-a256-3f0bb8d23ed8"
Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.548032 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-f33b-account-create-update-vbmkj" podUID="ce3508e5-7126-47b0-a598-da6515457cb7"
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.549913 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.549960 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.549969 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.549987 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.549997 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chcvg\" (UniqueName: \"kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.550026 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.554765 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.555261 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" containerName="nova-cell0-conductor-conductor" containerID="cri-o://9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b" gracePeriod=30
Mar 08 19:56:26 crc kubenswrapper[4885]: W0308 19:56:26.561253 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ba6c20_150e_48ca_ac4a_4a6a8ef1f525.slice/crio-34abe95868deb87094ecfc507d2e30a4f3b296f2e25ac6896a50280fdc7341bb WatchSource:0}: Error finding container 34abe95868deb87094ecfc507d2e30a4f3b296f2e25ac6896a50280fdc7341bb: Status 404 returned error can't find the container with id 34abe95868deb87094ecfc507d2e30a4f3b296f2e25ac6896a50280fdc7341bb
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.573914 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.582458 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.586772 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5pkh8"]
Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.591709 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 08 19:56:26 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: if [ -n "glance" ]; then
Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="glance"
Mar 08 19:56:26 crc kubenswrapper[4885]: else
Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="*"
Mar 08 19:56:26 crc kubenswrapper[4885]: fi
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: # going for maximum compatibility here:
Mar 08 19:56:26 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 08 19:56:26 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 08 19:56:26 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 08 19:56:26 crc kubenswrapper[4885]: # support updates
Mar 08 19:56:26 crc kubenswrapper[4885]: 
Mar 08 19:56:26 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError"
Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.593719 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-a047-account-create-update-qrjwk" podUID="d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525"
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.597859 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5pkh8"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.612529 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-796cf584f6-dfmcm"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.612764 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-796cf584f6-dfmcm" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api-log" containerID="cri-o://d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b" gracePeriod=30
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.613185 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-796cf584f6-dfmcm" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api" containerID="cri-o://786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44" gracePeriod=30
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.624249 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bq8dk"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.629766 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.629969 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="945717bc-405f-4628-934c-66e4500f56f0" containerName="nova-scheduler-scheduler" containerID="cri-o://aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" gracePeriod=30
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.633067 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.639570 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663174 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") "
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663349 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") "
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663378 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") "
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663417 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") "
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663461 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk872\" (UniqueName: \"kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") "
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663491 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") "
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") "
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663607 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") "
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.666117 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.666145 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.666172 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.669637 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.675021 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts" (OuterVolumeSpecName: "scripts") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.675074 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config" (OuterVolumeSpecName: "config") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.677308 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872" (OuterVolumeSpecName: "kube-api-access-wk872") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "kube-api-access-wk872". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.678033 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.688274 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="rabbitmq" containerID="cri-o://c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e" gracePeriod=604800
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.729191 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.763899 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bq8dk"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767547 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767583 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767593 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk872\" (UniqueName: \"kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767603 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767633 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" "
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767642 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.775524 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.783972 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.787689 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.813562 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mn4lz"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.815968 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.818049 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.824701 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mn4lz"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.830969 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.839754 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.851545 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"]
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.864320 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc"
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.869032 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.869133 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.869231 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.089820 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5qsh8_474d55a2-f4f0-4e46-809c-367a3110c33d/openstack-network-exporter/0.log"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.089898 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5qsh8" event={"ID":"474d55a2-f4f0-4e46-809c-367a3110c33d","Type":"ContainerDied","Data":"35bbd23060f341a3429a5bab6384434330b103927103bf0acea28883cf67dc65"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.089971 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5qsh8"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.090009 4885 scope.go:117] "RemoveContainer" containerID="e8623499ee05972629def49d746cd17bbc01095c41d8c7e431c723d1dc4187ee"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.094393 4885 generic.go:334] "Generic (PLEG): container finished" podID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerID="52b3a63b99137a90e03814ec54cafbe70d9ceac7f36dc9e81e362bf716f0276b" exitCode=0
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.094433 4885 generic.go:334] "Generic (PLEG): container finished" podID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerID="67f1959dc61ea688f5069874c4fb20e1a6cd0f9f33725553c532e43624ce5124" exitCode=0
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.094487 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerDied","Data":"52b3a63b99137a90e03814ec54cafbe70d9ceac7f36dc9e81e362bf716f0276b"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.094515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerDied","Data":"67f1959dc61ea688f5069874c4fb20e1a6cd0f9f33725553c532e43624ce5124"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.100651 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a083cf5-4ca2-440c-840a-6b159151609f" containerID="d5cd5c3527dc17515d5a33bed3c5118e0fcbd6d15187bcfb409883f29afc80a6" exitCode=143
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.100696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerDied","Data":"d5cd5c3527dc17515d5a33bed3c5118e0fcbd6d15187bcfb409883f29afc80a6"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.104136 4885 generic.go:334] "Generic (PLEG): container finished" podID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" exitCode=0
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.104258 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerDied","Data":"a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.107528 4885 generic.go:334] "Generic (PLEG): container finished" podID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerID="4e4c7d9e4404bd1a5433f1787f2f7abf1d5d2e0fd51aebb6079e0aa7c48cd16e" exitCode=143
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.107662 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerDied","Data":"4e4c7d9e4404bd1a5433f1787f2f7abf1d5d2e0fd51aebb6079e0aa7c48cd16e"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.109782 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bq8dk" event={"ID":"09db13b9-d564-49c9-b383-5fbfe0e43c9b","Type":"ContainerStarted","Data":"8eb8421d6a99d9718a9f12940cde328117f97ad0e5bd5e92cde85c15eeb1502b"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.110265 4885 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-bq8dk" secret="" err="secret \"galera-openstack-cell1-dockercfg-5kqpk\" not found"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.121705 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 08 19:56:27 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash
Mar 08 19:56:27 crc kubenswrapper[4885]: 
Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 08 19:56:27 crc kubenswrapper[4885]: 
Mar 08 19:56:27 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 08 19:56:27 crc kubenswrapper[4885]: 
Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 08 19:56:27 crc kubenswrapper[4885]: 
Mar 08 19:56:27 crc kubenswrapper[4885]: if [ -n "" ]; then
Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE=""
Mar 08 19:56:27 crc kubenswrapper[4885]: else
Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE="*"
Mar 08 19:56:27 crc kubenswrapper[4885]: fi
Mar 08 19:56:27 crc kubenswrapper[4885]: 
Mar 08 19:56:27 crc kubenswrapper[4885]: # going for maximum compatibility here:
Mar 08 19:56:27 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 08 19:56:27 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 08 19:56:27 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 08 19:56:27 crc kubenswrapper[4885]: # support updates
Mar 08 19:56:27 crc kubenswrapper[4885]: 
Mar 08 19:56:27 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.123264 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-bq8dk" podUID="09db13b9-d564-49c9-b383-5fbfe0e43c9b"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.125000 4885 generic.go:334] "Generic (PLEG): container finished" podID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerID="2d562399fb223e806d5f3ddff2425b5e427d18a16330c90ac41a561625d41719" exitCode=143
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.125047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerDied","Data":"2d562399fb223e806d5f3ddff2425b5e427d18a16330c90ac41a561625d41719"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.137204 4885 generic.go:334] "Generic (PLEG): container finished" podID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerID="db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" exitCode=0
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.137273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerDied","Data":"db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.142771 4885 generic.go:334] "Generic (PLEG): container finished" podID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerID="ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5" exitCode=0
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.142849 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerDied","Data":"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.154108 4885 generic.go:334] "Generic (PLEG): container finished" podID="a7268474-e124-4139-bf24-6b3f605b9511" containerID="3a20cf21bbfb4da8c71131e4075d64b83bae96d5c5020bc3cfadcf8d7226f8bc" exitCode=143
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.154222 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerDied","Data":"3a20cf21bbfb4da8c71131e4075d64b83bae96d5c5020bc3cfadcf8d7226f8bc"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.164182 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.166512 4885 generic.go:334] "Generic (PLEG): container finished" podID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerID="46919954f7a8695f89f60ffd2c95fd19f9f50cf97e2bbb06931bbceff7c47a47" exitCode=0
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.166579 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerDied","Data":"46919954f7a8695f89f60ffd2c95fd19f9f50cf97e2bbb06931bbceff7c47a47"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.173596 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a047-account-create-update-qrjwk" event={"ID":"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525","Type":"ContainerStarted","Data":"34abe95868deb87094ecfc507d2e30a4f3b296f2e25ac6896a50280fdc7341bb"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.176391 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f33b-account-create-update-vbmkj" event={"ID":"ce3508e5-7126-47b0-a598-da6515457cb7","Type":"ContainerStarted","Data":"92b9ea85ebe6946aa9aeb6553980bbeaa2d164cb127da6a8cd1b41eb2565b1b7"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.180404 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"]
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.180955 4885 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.181874 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts podName:09db13b9-d564-49c9-b383-5fbfe0e43c9b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:27.681858399 +0000 UTC m=+1489.077912422 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts") pod "root-account-create-update-bq8dk" (UID: "09db13b9-d564-49c9-b383-5fbfe0e43c9b") : configmap "openstack-cell1-scripts" not found Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.186361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-zb2lr" event={"ID":"619a568c-d0c3-408b-96c1-39a3a769d1ad","Type":"ContainerStarted","Data":"ecad256f4e94f24cfab1fe144e2c009ebd76a6803d21f421f8c4cfc3079aa401"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.190876 4885 generic.go:334] "Generic (PLEG): container finished" podID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerID="8eea35d6899ebbfa973a2d7d5bbaae841e42e4044906216a557300884b93a37e" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.190957 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerDied","Data":"8eea35d6899ebbfa973a2d7d5bbaae841e42e4044906216a557300884b93a37e"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.191086 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.221374 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b8dd6448-dd16-4487-bc90-f835712effc1/ovsdbserver-sb/0.log" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.221424 4885 generic.go:334] "Generic (PLEG): container finished" podID="b8dd6448-dd16-4487-bc90-f835712effc1" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.221534 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.221964 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerDied","Data":"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.221990 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerDied","Data":"27094a5eecfea3bd81d2314594b8cfdb03f329abe60f33c847c8c969d4747a0d"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.254360 4885 generic.go:334] "Generic (PLEG): container finished" podID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerID="d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.254434 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerDied","Data":"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281511 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281556 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jqgd\" (UniqueName: \"kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: 
I0308 19:56:27.281614 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281647 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281669 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281716 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281754 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281826 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default\") pod 
\"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.282369 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.282419 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data podName:01dc1fd5-4e2f-4129-9452-ed50fa1d182b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:31.282405515 +0000 UTC m=+1492.678459538 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data") pod "rabbitmq-server-0" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b") : configmap "rabbitmq-config-data" not found Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.283444 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.286025 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.286608 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-ss6cl" event={"ID":"36678078-1658-4edc-a256-3f0bb8d23ed8","Type":"ContainerStarted","Data":"81ac023e6b35f76413c8e20a66666b09686bcf823a6a24f94f4a77215fb4b2b2"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.288069 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.288866 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.304794 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.336593 4885 generic.go:334] "Generic (PLEG): container finished" podID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerID="48e1f046f7d97f16af118173fbff33a7753d9f3ef98b111d3153850bfbfdaf65" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.336686 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerDied","Data":"48e1f046f7d97f16af118173fbff33a7753d9f3ef98b111d3153850bfbfdaf65"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.336722 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd" (OuterVolumeSpecName: "kube-api-access-8jqgd") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "kube-api-access-8jqgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.336733 4885 scope.go:117] "RemoveContainer" containerID="f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.337796 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.338165 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:27 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: if [ -n "barbican" ]; then Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE="barbican" Mar 08 19:56:27 crc kubenswrapper[4885]: else Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:27 crc kubenswrapper[4885]: fi Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:27 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:27 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:27 crc kubenswrapper[4885]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:27 crc kubenswrapper[4885]: # support updates Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.339808 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-86ea-account-create-update-zb2lr" podUID="619a568c-d0c3-408b-96c1-39a3a769d1ad" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.354732 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.364094 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.379239 4885 scope.go:117] "RemoveContainer" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.382163 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.385378 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386342 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jqgd\" (UniqueName: \"kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386383 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386395 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386408 4885 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386430 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386455 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc 
kubenswrapper[4885]: I0308 19:56:27.389693 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.393431 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.399836 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" path="/var/lib/kubelet/pods/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.400400 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" path="/var/lib/kubelet/pods/1c223ffe-b12c-4c78-920a-66e6feb9178f/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.401055 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" path="/var/lib/kubelet/pods/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.402020 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="474d55a2-f4f0-4e46-809c-367a3110c33d" path="/var/lib/kubelet/pods/474d55a2-f4f0-4e46-809c-367a3110c33d/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.402497 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7dd20b-387a-4061-ab5a-a53ee6a240ef" path="/var/lib/kubelet/pods/4a7dd20b-387a-4061-ab5a-a53ee6a240ef/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.403002 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f0edc25-2cc1-4111-96e3-3807e6463d57" path="/var/lib/kubelet/pods/5f0edc25-2cc1-4111-96e3-3807e6463d57/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.403510 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="761f5c93-2ed3-43f0-acaf-ee92d0719ec3" path="/var/lib/kubelet/pods/761f5c93-2ed3-43f0-acaf-ee92d0719ec3/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.406359 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.409450 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" path="/var/lib/kubelet/pods/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.409958 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" path="/var/lib/kubelet/pods/8185583f-0ca5-46b1-a1ed-77c35b13a07b/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.410480 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84470b78-5e74-473c-88d3-5343943c01fb" path="/var/lib/kubelet/pods/84470b78-5e74-473c-88d3-5343943c01fb/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.420062 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92191eaa-0c0a-4927-adf4-a4e386ed2552" path="/var/lib/kubelet/pods/92191eaa-0c0a-4927-adf4-a4e386ed2552/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.420659 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" path="/var/lib/kubelet/pods/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.422626 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" 
path="/var/lib/kubelet/pods/b8dd6448-dd16-4487-bc90-f835712effc1/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.423201 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" path="/var/lib/kubelet/pods/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.423709 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" path="/var/lib/kubelet/pods/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.429466 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" path="/var/lib/kubelet/pods/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.442547 4885 scope.go:117] "RemoveContainer" containerID="f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.463653 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8\": container with ID starting with f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8 not found: ID does not exist" containerID="f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.463716 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8"} err="failed to get container status \"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8\": rpc error: code = NotFound desc = could not find container \"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8\": container with ID starting with 
f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8 not found: ID does not exist" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.463745 4885 scope.go:117] "RemoveContainer" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.466839 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7\": container with ID starting with 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 not found: ID does not exist" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.466870 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7"} err="failed to get container status \"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7\": rpc error: code = NotFound desc = could not find container \"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7\": container with ID starting with 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 not found: ID does not exist" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.466887 4885 scope.go:117] "RemoveContainer" containerID="6670e6817995526cd80a6c1b2064f3af999a3d367e59a87b40d4c34b2c61c6e3" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479188 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479241 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" 
containerID="2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479248 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479255 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479305 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479328 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479340 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479349 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.481107 4885 generic.go:334] "Generic (PLEG): container finished" podID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" 
containerID="32768cc82a61a862691a2497facf7d05127e1de2145b441cde8a059f45152b82" exitCode=0
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.481155 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"90fb4d53-4722-4f72-9f1a-99ee2b637f6e","Type":"ContainerDied","Data":"32768cc82a61a862691a2497facf7d05127e1de2145b441cde8a059f45152b82"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.481211 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.484691 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d768ed9e-b089-4308-befc-e3bd6aa68683/ovsdbserver-nb/0.log"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.484723 4885 generic.go:334] "Generic (PLEG): container finished" podID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerID="4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c" exitCode=143
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.485240 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.486336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerDied","Data":"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.486364 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerDied","Data":"d04c91ea8c65ff23403733626ccc4e0944e79d09733c79a70fe91cced28380f8"}
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.486561 4885 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" secret="" err="secret \"galera-openstack-cell1-dockercfg-5kqpk\" not found"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.489449 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 08 19:56:27 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash
Mar 08 19:56:27 crc kubenswrapper[4885]:
Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh
Mar 08 19:56:27 crc kubenswrapper[4885]:
Mar 08 19:56:27 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."}
Mar 08 19:56:27 crc kubenswrapper[4885]:
Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306"
Mar 08 19:56:27 crc kubenswrapper[4885]:
Mar 08 19:56:27 crc kubenswrapper[4885]: if [ -n "nova_cell1" ]; then
Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE="nova_cell1"
Mar 08 19:56:27 crc kubenswrapper[4885]: else
Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE="*"
Mar 08 19:56:27 crc kubenswrapper[4885]: fi
Mar 08 19:56:27 crc kubenswrapper[4885]:
Mar 08 19:56:27 crc kubenswrapper[4885]: # going for maximum compatibility here:
Mar 08 19:56:27 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used
Mar 08 19:56:27 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not
Mar 08 19:56:27 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to
Mar 08 19:56:27 crc kubenswrapper[4885]: # support updates
Mar 08 19:56:27 crc kubenswrapper[4885]:
Mar 08 19:56:27 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.490991 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" podUID="a3e6330b-4e2d-44ca-b9be-d36b2f613571"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.495308 4885 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.496316 4885 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.496425 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts podName:a3e6330b-4e2d-44ca-b9be-d36b2f613571 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:29.49640977 +0000 UTC m=+1490.892463793 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts") pod "nova-cell1-cd3f-account-create-update-ccxvz" (UID: "a3e6330b-4e2d-44ca-b9be-d36b2f613571") : configmap "openstack-cell1-scripts" not found
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.524910 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.607806 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.610748 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.611972 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.612005 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data\") pod \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.612046 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.612065 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.612086 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs\") pod \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.612120 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.614064 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs\") pod \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.614126 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.614163 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxpr8\" (UniqueName: \"kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8\") pod \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.614200 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsh5k\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.614477 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle\") pod \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") "
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.616251 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.617350 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.617374 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.637588 4885 scope.go:117] "RemoveContainer" containerID="32768cc82a61a862691a2497facf7d05127e1de2145b441cde8a059f45152b82"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.637745 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.647218 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.647555 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-central-agent" containerID="cri-o://a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7" gracePeriod=30
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.650185 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-notification-agent" containerID="cri-o://7e93f87815197e303bd6f0ad768ac092887798b010a49cf9460b37861d1fc6db" gracePeriod=30
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.650368 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="sg-core" containerID="cri-o://c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19" gracePeriod=30
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.650954 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="proxy-httpd" containerID="cri-o://46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688" gracePeriod=30
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.680437 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.693253 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k" (OuterVolumeSpecName: "kube-api-access-jsh5k") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "kube-api-access-jsh5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.711839 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8" (OuterVolumeSpecName: "kube-api-access-qxpr8") pod "90fb4d53-4722-4f72-9f1a-99ee2b637f6e" (UID: "90fb4d53-4722-4f72-9f1a-99ee2b637f6e"). InnerVolumeSpecName "kube-api-access-qxpr8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.730249 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.730290 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxpr8\" (UniqueName: \"kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.730300 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsh5k\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.730372 4885 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.730423 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts podName:09db13b9-d564-49c9-b383-5fbfe0e43c9b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:28.730408286 +0000 UTC m=+1490.126462309 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts") pod "root-account-create-update-bq8dk" (UID: "09db13b9-d564-49c9-b383-5fbfe0e43c9b") : configmap "openstack-cell1-scripts" not found
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.791472 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.800821 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.817041 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.817276 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="63c3ea8e-9683-45b9-805b-d1049840b0da" containerName="kube-state-metrics" containerID="cri-o://72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b" gracePeriod=30
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.841406 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.841650 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="da1d62ba-4033-4906-87c1-d673c1ab8637" containerName="memcached" containerID="cri-o://f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb" gracePeriod=30
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.857399 4885 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.861094 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3705-account-create-update-2brz9"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.867597 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3705-account-create-update-2brz9"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.875584 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-zfp8t"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.880519 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-zfp8t"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.889892 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-j97wh"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.899803 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3705-account-create-update-c72qw"]
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900266 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" containerName="nova-cell1-novncproxy-novncproxy"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900288 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" containerName="nova-cell1-novncproxy-novncproxy"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900305 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="dnsmasq-dns"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900312 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="dnsmasq-dns"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900321 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="init"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900328 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="init"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900337 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="openstack-network-exporter"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900343 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="openstack-network-exporter"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900353 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="474d55a2-f4f0-4e46-809c-367a3110c33d" containerName="openstack-network-exporter"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900359 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="474d55a2-f4f0-4e46-809c-367a3110c33d" containerName="openstack-network-exporter"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900367 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="ovsdbserver-nb"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900372 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="ovsdbserver-nb"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900381 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-httpd"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900387 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-httpd"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900400 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="mysql-bootstrap"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900407 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="mysql-bootstrap"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900416 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="galera"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900422 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="galera"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900442 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900448 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900458 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="ovsdbserver-sb"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900464 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="ovsdbserver-sb"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900474 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-server"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900479 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-server"
Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900494 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="openstack-network-exporter"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900499 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="openstack-network-exporter"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900673 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="474d55a2-f4f0-4e46-809c-367a3110c33d" containerName="openstack-network-exporter"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900682 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="dnsmasq-dns"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900690 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-server"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900700 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900708 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="ovsdbserver-sb"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900720 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="ovsdbserver-nb"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900734 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-httpd"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900744 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="galera"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900755 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="openstack-network-exporter"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900762 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="openstack-network-exporter"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900772 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" containerName="nova-cell1-novncproxy-novncproxy"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.902096 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3705-account-create-update-c72qw"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.902715 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.910236 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-j97wh"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.912257 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90fb4d53-4722-4f72-9f1a-99ee2b637f6e" (UID: "90fb4d53-4722-4f72-9f1a-99ee2b637f6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.915600 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.924836 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.925059 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-574d5c476f-sq4hm" podUID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" containerName="keystone-api" containerID="cri-o://0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421" gracePeriod=30
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.939754 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3705-account-create-update-c72qw"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.948653 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.959755 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.959781 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.962970 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ll64z"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.968601 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3705-account-create-update-c72qw"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.976689 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.978282 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ll64z"]
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.980656 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "90fb4d53-4722-4f72-9f1a-99ee2b637f6e" (UID: "90fb4d53-4722-4f72-9f1a-99ee2b637f6e"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.986166 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data" (OuterVolumeSpecName: "config-data") pod "90fb4d53-4722-4f72-9f1a-99ee2b637f6e" (UID: "90fb4d53-4722-4f72-9f1a-99ee2b637f6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.987515 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "90fb4d53-4722-4f72-9f1a-99ee2b637f6e" (UID: "90fb4d53-4722-4f72-9f1a-99ee2b637f6e"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.997275 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.003945 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.005473 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data" (OuterVolumeSpecName: "config-data") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.005706 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-k2xgr operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-3705-account-create-update-c72qw" podUID="9e649171-680c-445a-b418-734a5c7322e3"
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.008669 4885 scope.go:117] "RemoveContainer" containerID="e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5"
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.009546 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f33b-account-create-update-vbmkj"
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.064601 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts\") pod \"ce3508e5-7126-47b0-a598-da6515457cb7\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") "
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.064885 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs4qz\" (UniqueName: \"kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz\") pod \"9bbdf164-51e7-4faf-986b-fba5044fad2b\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") "
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.064935 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkg9d\" (UniqueName: \"kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d\") pod \"ce3508e5-7126-47b0-a598-da6515457cb7\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") "
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065014 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data\") pod \"9bbdf164-51e7-4faf-986b-fba5044fad2b\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") "
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2xgr\" (UniqueName: \"kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw"
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065152 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw"
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065345 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065362 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065371 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065379 4885 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065388 4885 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065398 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065787 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce3508e5-7126-47b0-a598-da6515457cb7" (UID: "ce3508e5-7126-47b0-a598-da6515457cb7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.073842 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz" (OuterVolumeSpecName: "kube-api-access-hs4qz") pod "9bbdf164-51e7-4faf-986b-fba5044fad2b" (UID: "9bbdf164-51e7-4faf-986b-fba5044fad2b"). InnerVolumeSpecName "kube-api-access-hs4qz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.104895 4885 scope.go:117] "RemoveContainer" containerID="4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c"
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.106834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d" (OuterVolumeSpecName: "kube-api-access-dkg9d") pod "ce3508e5-7126-47b0-a598-da6515457cb7" (UID: "ce3508e5-7126-47b0-a598-da6515457cb7"). InnerVolumeSpecName "kube-api-access-dkg9d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.137124 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data" (OuterVolumeSpecName: "config-data") pod "9bbdf164-51e7-4faf-986b-fba5044fad2b" (UID: "9bbdf164-51e7-4faf-986b-fba5044fad2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.152686 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.155180 4885 scope.go:117] "RemoveContainer" containerID="e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5"
Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.155602 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5\": container with ID starting with e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5 not found: ID does not exist" containerID="e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5"
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.155641 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5"} err="failed to get container status \"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5\": rpc error: code = NotFound desc = could not find container \"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5\": container with ID starting with e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5 not found: ID does not exist"
Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.155669 4885 scope.go:117] "RemoveContainer"
containerID="4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.155885 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c\": container with ID starting with 4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c not found: ID does not exist" containerID="4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.155904 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c"} err="failed to get container status \"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c\": rpc error: code = NotFound desc = could not find container \"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c\": container with ID starting with 4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c not found: ID does not exist" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.159131 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.166892 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle\") pod \"9bbdf164-51e7-4faf-986b-fba5044fad2b\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167404 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2xgr\" (UniqueName: \"kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr\") pod \"keystone-3705-account-create-update-c72qw\" (UID: 
\"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167529 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167724 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167798 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs4qz\" (UniqueName: \"kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167856 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkg9d\" (UniqueName: \"kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167910 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.167935 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.168190 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data podName:96257eac-42ec-44cf-80be-9be68c0ebb1b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:32.168157645 +0000 UTC m=+1493.564211668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b") : configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.168567 4885 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.168659 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:28.668648838 +0000 UTC m=+1490.064702861 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : configmap "openstack-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.172693 4885 projected.go:194] Error preparing data for projected volume kube-api-access-k2xgr for pod openstack/keystone-3705-account-create-update-c72qw: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.172750 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. 
No retries permitted until 2026-03-08 19:56:28.672733056 +0000 UTC m=+1490.068787079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-k2xgr" (UniqueName: "kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.178737 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="galera" containerID="cri-o://88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67" gracePeriod=30 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.204346 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.214823 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.217352 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1f465c_123b_455f_8bd8_720d3f8a4bef.slice/crio-conmon-46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1f465c_123b_455f_8bd8_720d3f8a4bef.slice/crio-c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90fb4d53_4722_4f72_9f1a_99ee2b637f6e.slice/crio-36ec2d571999d46c471303241aed86c9b5d0e06a8eac136c91582cb662c8d63a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1f465c_123b_455f_8bd8_720d3f8a4bef.slice/crio-conmon-a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63c3ea8e_9683_45b9_805b_d1049840b0da.slice/crio-72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63c3ea8e_9683_45b9_805b_d1049840b0da.slice/crio-conmon-72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.253847 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.259154 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bbdf164-51e7-4faf-986b-fba5044fad2b" (UID: "9bbdf164-51e7-4faf-986b-fba5044fad2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.269829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts\") pod \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.269896 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts\") pod \"2242ad5f-8a7e-4017-8441-6d05b2c94930\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.270422 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.270859 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525" (UID: "d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.270892 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2242ad5f-8a7e-4017-8441-6d05b2c94930" (UID: "2242ad5f-8a7e-4017-8441-6d05b2c94930"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.340667 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.371211 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfcbl\" (UniqueName: \"kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl\") pod \"36678078-1658-4edc-a256-3f0bb8d23ed8\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.371589 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhdsh\" (UniqueName: \"kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh\") pod \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.371710 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts\") pod \"36678078-1658-4edc-a256-3f0bb8d23ed8\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.371880 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hl27\" (UniqueName: 
\"kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27\") pod \"2242ad5f-8a7e-4017-8441-6d05b2c94930\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.372217 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36678078-1658-4edc-a256-3f0bb8d23ed8" (UID: "36678078-1658-4edc-a256-3f0bb8d23ed8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.373109 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.373284 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.373366 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.374935 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh" (OuterVolumeSpecName: "kube-api-access-bhdsh") pod "d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525" (UID: "d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525"). InnerVolumeSpecName "kube-api-access-bhdsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.375499 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27" (OuterVolumeSpecName: "kube-api-access-9hl27") pod "2242ad5f-8a7e-4017-8441-6d05b2c94930" (UID: "2242ad5f-8a7e-4017-8441-6d05b2c94930"). InnerVolumeSpecName "kube-api-access-9hl27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.376316 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl" (OuterVolumeSpecName: "kube-api-access-rfcbl") pod "36678078-1658-4edc-a256-3f0bb8d23ed8" (UID: "36678078-1658-4edc-a256-3f0bb8d23ed8"). InnerVolumeSpecName "kube-api-access-rfcbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.423881 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.424453 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.425005 
4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.425072 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.425449 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.426986 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.429243 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.429314 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.475031 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config\") pod \"63c3ea8e-9683-45b9-805b-d1049840b0da\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.475677 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle\") pod \"63c3ea8e-9683-45b9-805b-d1049840b0da\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.475769 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqx6r\" (UniqueName: \"kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r\") pod \"63c3ea8e-9683-45b9-805b-d1049840b0da\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.476051 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs\") pod \"63c3ea8e-9683-45b9-805b-d1049840b0da\" (UID: 
\"63c3ea8e-9683-45b9-805b-d1049840b0da\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.476706 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhdsh\" (UniqueName: \"kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.476726 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hl27\" (UniqueName: \"kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.476740 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfcbl\" (UniqueName: \"kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.487824 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r" (OuterVolumeSpecName: "kube-api-access-bqx6r") pod "63c3ea8e-9683-45b9-805b-d1049840b0da" (UID: "63c3ea8e-9683-45b9-805b-d1049840b0da"). InnerVolumeSpecName "kube-api-access-bqx6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.507314 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" event={"ID":"2242ad5f-8a7e-4017-8441-6d05b2c94930","Type":"ContainerDied","Data":"82162e312894c7024d5a8960ccd0eb1055e87a8eb4f79ab3a9a1cc489e5a6576"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.507367 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.511655 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a047-account-create-update-qrjwk" event={"ID":"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525","Type":"ContainerDied","Data":"34abe95868deb87094ecfc507d2e30a4f3b296f2e25ac6896a50280fdc7341bb"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.511778 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.515714 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63c3ea8e-9683-45b9-805b-d1049840b0da" (UID: "63c3ea8e-9683-45b9-805b-d1049840b0da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.519302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-ss6cl" event={"ID":"36678078-1658-4edc-a256-3f0bb8d23ed8","Type":"ContainerDied","Data":"81ac023e6b35f76413c8e20a66666b09686bcf823a6a24f94f4a77215fb4b2b2"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.519314 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.527601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerDied","Data":"9491886528ee59a2997b30d600c2b1b7132f56a89a9aaa3a02c1e5325fbb4651"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.527649 4885 scope.go:117] "RemoveContainer" containerID="db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.527823 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.531610 4885 generic.go:334] "Generic (PLEG): container finished" podID="9bbdf164-51e7-4faf-986b-fba5044fad2b" containerID="bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6" exitCode=0 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.531700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bbdf164-51e7-4faf-986b-fba5044fad2b","Type":"ContainerDied","Data":"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.531717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bbdf164-51e7-4faf-986b-fba5044fad2b","Type":"ContainerDied","Data":"c280fe9fe13ec4f9da8c09591afb05298340012fc23ed25819dbc3970dce1bc0"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.531737 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.534147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "63c3ea8e-9683-45b9-805b-d1049840b0da" (UID: "63c3ea8e-9683-45b9-805b-d1049840b0da"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.535288 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerDied","Data":"a9b17f7da8fbb915380a49441fc73dc30251abdd34aa8c926a52efea3b64bcdc"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.535421 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.552523 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f33b-account-create-update-vbmkj" event={"ID":"ce3508e5-7126-47b0-a598-da6515457cb7","Type":"ContainerDied","Data":"92b9ea85ebe6946aa9aeb6553980bbeaa2d164cb127da6a8cd1b41eb2565b1b7"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.552596 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.556103 4885 generic.go:334] "Generic (PLEG): container finished" podID="63c3ea8e-9683-45b9-805b-d1049840b0da" containerID="72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b" exitCode=2 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.556229 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.556824 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"63c3ea8e-9683-45b9-805b-d1049840b0da","Type":"ContainerDied","Data":"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.556860 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"63c3ea8e-9683-45b9-805b-d1049840b0da","Type":"ContainerDied","Data":"c766b62f5cceadf0886905919f92953b9185d1630ad51bf23b383898036558fd"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561208 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerID="46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688" exitCode=0 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561241 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerID="c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19" exitCode=2 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561253 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerID="a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7" exitCode=0 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561867 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerDied","Data":"46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561904 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerDied","Data":"c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561959 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerDied","Data":"a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.562255 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.563270 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "63c3ea8e-9683-45b9-805b-d1049840b0da" (UID: "63c3ea8e-9683-45b9-805b-d1049840b0da"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.578659 4885 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.578697 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.578714 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqx6r\" (UniqueName: \"kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.578741 4885 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.634511 4885 scope.go:117] "RemoveContainer" containerID="81b8543a909b03c12110951d0f4dfaca241eaccbf11cf8dd8e3aa4e40b790556" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.645098 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.684017 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2xgr\" (UniqueName: \"kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.684093 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.684447 4885 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.684514 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:29.684495924 +0000 UTC m=+1491.080549947 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : configmap "openstack-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.690180 4885 scope.go:117] "RemoveContainer" containerID="bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.690308 4885 projected.go:194] Error preparing data for projected volume kube-api-access-k2xgr for pod openstack/keystone-3705-account-create-update-c72qw: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.690370 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:29.69035167 +0000 UTC m=+1491.086405693 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-k2xgr" (UniqueName: "kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.702837 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.797994 4885 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.798060 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts podName:09db13b9-d564-49c9-b383-5fbfe0e43c9b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:30.798047845 +0000 UTC m=+1492.194101868 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts") pod "root-account-create-update-bq8dk" (UID: "09db13b9-d564-49c9-b383-5fbfe0e43c9b") : configmap "openstack-cell1-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.816126 4885 scope.go:117] "RemoveContainer" containerID="bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.816674 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6\": container with ID starting with bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6 not found: ID does not exist" containerID="bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.816719 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6"} err="failed to get container status \"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6\": rpc error: code = NotFound desc = could not find container \"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6\": container with ID starting with bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6 not found: ID does not exist" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.816745 4885 scope.go:117] "RemoveContainer" containerID="52b3a63b99137a90e03814ec54cafbe70d9ceac7f36dc9e81e362bf716f0276b" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.819501 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.835705 4885 scope.go:117] "RemoveContainer" 
containerID="67f1959dc61ea688f5069874c4fb20e1a6cd0f9f33725553c532e43624ce5124" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.841135 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.846955 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.852291 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.859061 4885 scope.go:117] "RemoveContainer" containerID="72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.862237 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.893004 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.898294 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.906843 4885 scope.go:117] "RemoveContainer" containerID="72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.908797 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b\": container with ID starting with 72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b not found: ID does not exist" containerID="72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.908861 4885 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b"} err="failed to get container status \"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b\": rpc error: code = NotFound desc = could not find container \"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b\": container with ID starting with 72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b not found: ID does not exist" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.930319 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.939802 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.956417 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.963751 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.002008 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.014114 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.014223 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.022618 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.024965 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-58c657b6d6-r4tf7" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.170:8778/\": read tcp 10.217.0.2:60334->10.217.0.170:8778: read: connection reset by peer" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.025190 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-58c657b6d6-r4tf7" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.170:8778/\": read tcp 10.217.0.2:60324->10.217.0.170:8778: read: connection reset by peer" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.028817 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.063185 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.102958 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts\") pod \"619a568c-d0c3-408b-96c1-39a3a769d1ad\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.103041 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqv7f\" (UniqueName: \"kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f\") pod \"619a568c-d0c3-408b-96c1-39a3a769d1ad\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.103246 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd8sz\" (UniqueName: \"kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz\") pod \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.103354 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts\") pod \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.103453 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "619a568c-d0c3-408b-96c1-39a3a769d1ad" (UID: "619a568c-d0c3-408b-96c1-39a3a769d1ad"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.103809 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09db13b9-d564-49c9-b383-5fbfe0e43c9b" (UID: "09db13b9-d564-49c9-b383-5fbfe0e43c9b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.104102 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.104123 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.108683 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz" (OuterVolumeSpecName: "kube-api-access-gd8sz") pod "09db13b9-d564-49c9-b383-5fbfe0e43c9b" (UID: "09db13b9-d564-49c9-b383-5fbfe0e43c9b"). InnerVolumeSpecName "kube-api-access-gd8sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.111135 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f" (OuterVolumeSpecName: "kube-api-access-hqv7f") pod "619a568c-d0c3-408b-96c1-39a3a769d1ad" (UID: "619a568c-d0c3-408b-96c1-39a3a769d1ad"). InnerVolumeSpecName "kube-api-access-hqv7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.176627 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:53768->10.217.0.210:8775: read: connection reset by peer" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.176653 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:53760->10.217.0.210:8775: read: connection reset by peer" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.182540 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.191827 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.168:8776/healthcheck\": read tcp 10.217.0.2:57112->10.217.0.168:8776: read: connection reset by peer" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.205739 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config\") pod \"da1d62ba-4033-4906-87c1-d673c1ab8637\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.205909 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle\") pod 
\"da1d62ba-4033-4906-87c1-d673c1ab8637\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.206041 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs\") pod \"da1d62ba-4033-4906-87c1-d673c1ab8637\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.206093 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q554n\" (UniqueName: \"kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n\") pod \"da1d62ba-4033-4906-87c1-d673c1ab8637\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.206138 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data\") pod \"da1d62ba-4033-4906-87c1-d673c1ab8637\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.206575 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqv7f\" (UniqueName: \"kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.206588 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd8sz\" (UniqueName: \"kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.207166 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data" (OuterVolumeSpecName: "config-data") pod 
"da1d62ba-4033-4906-87c1-d673c1ab8637" (UID: "da1d62ba-4033-4906-87c1-d673c1ab8637"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.213285 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "da1d62ba-4033-4906-87c1-d673c1ab8637" (UID: "da1d62ba-4033-4906-87c1-d673c1ab8637"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.218515 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n" (OuterVolumeSpecName: "kube-api-access-q554n") pod "da1d62ba-4033-4906-87c1-d673c1ab8637" (UID: "da1d62ba-4033-4906-87c1-d673c1ab8637"). InnerVolumeSpecName "kube-api-access-q554n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.241654 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da1d62ba-4033-4906-87c1-d673c1ab8637" (UID: "da1d62ba-4033-4906-87c1-d673c1ab8637"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.276419 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "da1d62ba-4033-4906-87c1-d673c1ab8637" (UID: "da1d62ba-4033-4906-87c1-d673c1ab8637"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.307447 4885 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.307478 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q554n\" (UniqueName: \"kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.307488 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.307498 4885 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.307506 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.338518 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.382780 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2242ad5f-8a7e-4017-8441-6d05b2c94930" path="/var/lib/kubelet/pods/2242ad5f-8a7e-4017-8441-6d05b2c94930/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.384308 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="321f89cf-ed1f-4f10-a198-e55c23171363" path="/var/lib/kubelet/pods/321f89cf-ed1f-4f10-a198-e55c23171363/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.385674 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36678078-1658-4edc-a256-3f0bb8d23ed8" path="/var/lib/kubelet/pods/36678078-1658-4edc-a256-3f0bb8d23ed8/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.386067 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43dd77c8-6951-423a-9334-502f66c3d1b5" path="/var/lib/kubelet/pods/43dd77c8-6951-423a-9334-502f66c3d1b5/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.387067 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" path="/var/lib/kubelet/pods/60f9821e-e554-4594-bfb2-9521cd3c171a/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.387598 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63c3ea8e-9683-45b9-805b-d1049840b0da" path="/var/lib/kubelet/pods/63c3ea8e-9683-45b9-805b-d1049840b0da/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.388390 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3418f5-a92a-4fe6-b0ea-929b54ecb052" path="/var/lib/kubelet/pods/8b3418f5-a92a-4fe6-b0ea-929b54ecb052/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.390093 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" path="/var/lib/kubelet/pods/90fb4d53-4722-4f72-9f1a-99ee2b637f6e/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.392759 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" path="/var/lib/kubelet/pods/925797ff-e1b0-4df7-83db-2091264a4bb8/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.393303 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bbdf164-51e7-4faf-986b-fba5044fad2b" path="/var/lib/kubelet/pods/9bbdf164-51e7-4faf-986b-fba5044fad2b/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.394301 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3508e5-7126-47b0-a598-da6515457cb7" path="/var/lib/kubelet/pods/ce3508e5-7126-47b0-a598-da6515457cb7/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.394672 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525" path="/var/lib/kubelet/pods/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.395290 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" path="/var/lib/kubelet/pods/d768ed9e-b089-4308-befc-e3bd6aa68683/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.397268 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7884923-e1d5-4b4d-a285-680bfbe38277" path="/var/lib/kubelet/pods/f7884923-e1d5-4b4d-a285-680bfbe38277/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.403222 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.408494 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts\") pod \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.408596 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zhx2\" (UniqueName: \"kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2\") pod \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.411071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3e6330b-4e2d-44ca-b9be-d36b2f613571" (UID: "a3e6330b-4e2d-44ca-b9be-d36b2f613571"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.415032 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2" (OuterVolumeSpecName: "kube-api-access-7zhx2") pod "a3e6330b-4e2d-44ca-b9be-d36b2f613571" (UID: "a3e6330b-4e2d-44ca-b9be-d36b2f613571"). InnerVolumeSpecName "kube-api-access-7zhx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.440379 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.445476 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.445544 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="945717bc-405f-4628-934c-66e4500f56f0" containerName="nova-scheduler-scheduler" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.510544 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.510572 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zhx2\" (UniqueName: \"kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.620489 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bq8dk" 
event={"ID":"09db13b9-d564-49c9-b383-5fbfe0e43c9b","Type":"ContainerDied","Data":"8eb8421d6a99d9718a9f12940cde328117f97ad0e5bd5e92cde85c15eeb1502b"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.620584 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.632087 4885 generic.go:334] "Generic (PLEG): container finished" podID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerID="41d8583c3d498141cc2e38d1ed8623082609a86faa3f087e124f2692ac0c8871" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.632165 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerDied","Data":"41d8583c3d498141cc2e38d1ed8623082609a86faa3f087e124f2692ac0c8871"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.650308 4885 generic.go:334] "Generic (PLEG): container finished" podID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerID="1b8b8e4856a24e16b23d4c15ef261857dfcc94531017a1b02728028102e1d5ce" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.650395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerDied","Data":"1b8b8e4856a24e16b23d4c15ef261857dfcc94531017a1b02728028102e1d5ce"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.650419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerDied","Data":"91a7898b581f4a0b0c09c7d67b2b320f9e2ef08425d7b081856b5b38c0f51cba"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.650431 4885 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="91a7898b581f4a0b0c09c7d67b2b320f9e2ef08425d7b081856b5b38c0f51cba" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.661709 4885 generic.go:334] "Generic (PLEG): container finished" podID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerID="2d8347d05b060d48223bdd22690e18395a24601df91142452e20c85edd93a56a" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.661788 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerDied","Data":"2d8347d05b060d48223bdd22690e18395a24601df91142452e20c85edd93a56a"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.663314 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-zb2lr" event={"ID":"619a568c-d0c3-408b-96c1-39a3a769d1ad","Type":"ContainerDied","Data":"ecad256f4e94f24cfab1fe144e2c009ebd76a6803d21f421f8c4cfc3079aa401"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.663415 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.675734 4885 generic.go:334] "Generic (PLEG): container finished" podID="a083a431-5afc-4289-a5cf-625bc619465e" containerID="4621296620fb47f88fa5eba2c20d8a6e4bdfb8042020e5c5c63fda5c073094aa" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.675791 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerDied","Data":"4621296620fb47f88fa5eba2c20d8a6e4bdfb8042020e5c5c63fda5c073094aa"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.678150 4885 generic.go:334] "Generic (PLEG): container finished" podID="da1d62ba-4033-4906-87c1-d673c1ab8637" containerID="f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.678196 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"da1d62ba-4033-4906-87c1-d673c1ab8637","Type":"ContainerDied","Data":"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.678212 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"da1d62ba-4033-4906-87c1-d673c1ab8637","Type":"ContainerDied","Data":"d5c7811542e264146da5dbb28e9ad294c9b3d2c5ddb970c427caa3521f6cd065"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.678229 4885 scope.go:117] "RemoveContainer" containerID="f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.678313 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.682966 4885 generic.go:334] "Generic (PLEG): container finished" podID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerID="f5e5f2790360729d9c4394c0e85bb4e8ea8164ab35be9023623e79b3a117f852" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.683027 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerDied","Data":"f5e5f2790360729d9c4394c0e85bb4e8ea8164ab35be9023623e79b3a117f852"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.698141 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.699610 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" event={"ID":"a3e6330b-4e2d-44ca-b9be-d36b2f613571","Type":"ContainerDied","Data":"17603ca27d7ecde56e3c1a978d3f1cf7aefc5c4500d55699231533afe9deacd5"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.711635 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.714111 4885 generic.go:334] "Generic (PLEG): container finished" podID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerID="7305f11ea3e6d044101cca24b88547af01f2e1506724f8e566c2a9df42c34dc6" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.714195 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerDied","Data":"7305f11ea3e6d044101cca24b88547af01f2e1506724f8e566c2a9df42c34dc6"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.714211 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.714605 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2xgr\" (UniqueName: \"kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.714665 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.715026 4885 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.715083 4885 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:31.715065117 +0000 UTC m=+1493.111119140 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : configmap "openstack-scripts" not found Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.718161 4885 projected.go:194] Error preparing data for projected volume kube-api-access-k2xgr for pod openstack/keystone-3705-account-create-update-c72qw: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.718218 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:31.718202421 +0000 UTC m=+1493.114256444 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-k2xgr" (UniqueName: "kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.726318 4885 scope.go:117] "RemoveContainer" containerID="f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb" Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.726831 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb\": container with ID starting with f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb not found: ID does not exist" containerID="f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.726875 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb"} err="failed to get container status \"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb\": rpc error: code = NotFound desc = could not find container \"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb\": container with ID starting with f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb not found: ID does not exist" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.728278 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.772907 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816238 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816310 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816350 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44fnl\" (UniqueName: \"kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816517 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816587 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816654 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816675 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.820619 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts" (OuterVolumeSpecName: "scripts") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.821035 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs" (OuterVolumeSpecName: "logs") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.845072 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.845111 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl" (OuterVolumeSpecName: "kube-api-access-44fnl") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "kube-api-access-44fnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.860956 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.900222 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data" (OuterVolumeSpecName: "config-data") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.911143 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.914087 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943314 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w82z5\" (UniqueName: \"kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943407 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943470 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: 
\"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943500 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs\") pod \"cdd926a8-442c-4f63-bb36-3e6a425436c2\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943550 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943639 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs\") pod \"cdd926a8-442c-4f63-bb36-3e6a425436c2\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943694 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943803 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle\") pod \"cdd926a8-442c-4f63-bb36-3e6a425436c2\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943834 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943901 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data\") pod \"cdd926a8-442c-4f63-bb36-3e6a425436c2\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943979 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxxc2\" (UniqueName: \"kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2\") pod \"cdd926a8-442c-4f63-bb36-3e6a425436c2\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.944438 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5" (OuterVolumeSpecName: "kube-api-access-w82z5") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "kube-api-access-w82z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.944821 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs" (OuterVolumeSpecName: "logs") pod "cdd926a8-442c-4f63-bb36-3e6a425436c2" (UID: "cdd926a8-442c-4f63-bb36-3e6a425436c2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.944985 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.945015 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.945028 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.945038 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.945047 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44fnl\" (UniqueName: \"kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.945057 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w82z5\" (UniqueName: \"kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.950077 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs" (OuterVolumeSpecName: "logs") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.969839 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.973783 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.976790 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2" (OuterVolumeSpecName: "kube-api-access-xxxc2") pod "cdd926a8-442c-4f63-bb36-3e6a425436c2" (UID: "cdd926a8-442c-4f63-bb36-3e6a425436c2"). InnerVolumeSpecName "kube-api-access-xxxc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.992648 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bq8dk"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.999535 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.005243 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.021146 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.042974 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bq8dk"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.045752 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9zl6\" (UniqueName: \"kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.045843 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.045956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046002 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs\") pod 
\"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046090 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046153 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046244 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046301 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046339 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046764 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047064 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs" (OuterVolumeSpecName: "logs") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047620 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047650 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047665 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047677 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxxc2\" (UniqueName: \"kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047693 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id\") on 
node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047706 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.055467 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.055498 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data" (OuterVolumeSpecName: "config-data") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.087290 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3705-account-create-update-c72qw"]
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.094875 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3705-account-create-update-c72qw"]
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.110706 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-796cf584f6-dfmcm" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:48972->10.217.0.167:9311: read: connection reset by peer"
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.111055 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-796cf584f6-dfmcm" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:48956->10.217.0.167:9311: read: connection reset by peer"
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.116591 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.128628 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts" (OuterVolumeSpecName: "scripts") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.128770 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6" (OuterVolumeSpecName: "kube-api-access-f9zl6") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "kube-api-access-f9zl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.132541 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdd926a8-442c-4f63-bb36-3e6a425436c2" (UID: "cdd926a8-442c-4f63-bb36-3e6a425436c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.139053 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.151429 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.151505 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tpp6\" (UniqueName: \"kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.151538 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.151645 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.152071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs" (OuterVolumeSpecName: "logs") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.152154 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.152842 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.152944 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.152992 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153066 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153398 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153415 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153425 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153434 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9zl6\" (UniqueName: \"kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153444 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153453 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2xgr\" (UniqueName: \"kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153461 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153470 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153478 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153486 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.168131 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.168294 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6" (OuterVolumeSpecName: "kube-api-access-7tpp6") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "kube-api-access-7tpp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.168448 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts" (OuterVolumeSpecName: "scripts") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.209285 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.224781 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.229869 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.235972 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.236889 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data" (OuterVolumeSpecName: "config-data") pod "cdd926a8-442c-4f63-bb36-3e6a425436c2" (UID: "cdd926a8-442c-4f63-bb36-3e6a425436c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.247669 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.254851 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.254892 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc6jf\" (UniqueName: \"kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.254950 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.254997 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255030 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255047 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255096 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255140 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255166 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255209 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255227 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255270 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255286 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255313 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255350 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzmz5\" (UniqueName: \"kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255659 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255677 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255686 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255695 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255705 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255714 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255723 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tpp6\" (UniqueName: \"kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.256567 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.256692 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs" (OuterVolumeSpecName: "logs") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.257879 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.258684 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.259951 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.262079 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.271356 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.271539 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf" (OuterVolumeSpecName: "kube-api-access-jc6jf") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "kube-api-access-jc6jf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.272212 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5" (OuterVolumeSpecName: "kube-api-access-hzmz5") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "kube-api-access-hzmz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.280522 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts" (OuterVolumeSpecName: "scripts") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.284413 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.285217 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cdd926a8-442c-4f63-bb36-3e6a425436c2" (UID: "cdd926a8-442c-4f63-bb36-3e6a425436c2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.291183 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.291544 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.312196 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.314330 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.334115 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.344975 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.346436 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357623 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357650 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzmz5\" (UniqueName: \"kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357671 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357680 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc6jf\" (UniqueName: \"kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357688 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357697 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357706 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357715 4885 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357722 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357733 4885 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357741 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357753 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357762 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357770 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357778 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357786 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357794 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357803 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357811 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default\") on node \"crc\" DevicePath \"\""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.359903 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.369997 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data" (OuterVolumeSpecName: "config-data") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.372034 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data" (OuterVolumeSpecName: "config-data") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.373180 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data" (OuterVolumeSpecName: "config-data") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.374314 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.374630 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.374959 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.387878 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.390640 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f1f46cb2-c95d-40f5-9acc-720e094b91bc/ovn-northd/0.log"
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.390702 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459391 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459469 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459515 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbl7q\" (UniqueName: \"kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459617 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459775 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") "
Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459851 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459902 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460450 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460483 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460501 4885 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460526 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460543 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460560 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460576 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460594 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.461320 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts" (OuterVolumeSpecName: "scripts") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.462043 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config" (OuterVolumeSpecName: "config") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.465143 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.470346 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q" (OuterVolumeSpecName: "kube-api-access-pbl7q") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "kube-api-access-pbl7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.521042 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.528037 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.528291 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562161 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562203 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562224 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562240 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562298 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562313 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbl7q\" (UniqueName: \"kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562326 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.650392 
4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.656618 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.661983 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.723824 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerDied","Data":"5cda51b7f06b2df67ff66ec569970b5a1ee1ed8790ed67119aa132ee93bae076"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.723868 4885 scope.go:117] "RemoveContainer" containerID="4621296620fb47f88fa5eba2c20d8a6e4bdfb8042020e5c5c63fda5c073094aa" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.723995 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.734543 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerDied","Data":"8290e58829785cfd7645e5b7ea06bfd203515f9adadd2b8e8b4383fbc9129293"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.734552 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.734763 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.736703 4885 generic.go:334] "Generic (PLEG): container finished" podID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerID="88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67" exitCode=0 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.736758 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerDied","Data":"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.736781 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerDied","Data":"c562a73365a8a4cec4d84c9a8ed8cc0d8747c679cc7180e594fd7812393beab5"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.736817 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.741434 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerDied","Data":"d206b01c706625f3b6d24a81cff0491b35107018670692e53d89b4cfafe0b053"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.741512 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.750460 4885 generic.go:334] "Generic (PLEG): container finished" podID="945717bc-405f-4628-934c-66e4500f56f0" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" exitCode=0 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.750521 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"945717bc-405f-4628-934c-66e4500f56f0","Type":"ContainerDied","Data":"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.750546 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"945717bc-405f-4628-934c-66e4500f56f0","Type":"ContainerDied","Data":"4a4309458de44ee77bdd68329eb6288f0e4d16b95e6497926e889647bf594a28"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.750597 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.755508 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.755508 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerDied","Data":"f56864cf75c0bf77d3e8eee5fd2c82834b4c4219c0b0d60077918b5a5fcf0612"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.760810 4885 scope.go:117] "RemoveContainer" containerID="701b0f4ea9f8a0e00b422b3381b168dbb04b81fe4ea92e28f34d36f84301009f" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765334 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id\") pod \"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765391 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom\") pod \"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765411 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765436 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: 
I0308 19:56:30.765456 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle\") pod \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765473 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwjwb\" (UniqueName: \"kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb\") pod \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765522 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data\") pod \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765567 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle\") pod \"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765595 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765619 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts\") pod 
\"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765638 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data\") pod \"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765660 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765683 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data\") pod \"945717bc-405f-4628-934c-66e4500f56f0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765709 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26jxd\" (UniqueName: \"kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765750 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765785 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle\") pod \"945717bc-405f-4628-934c-66e4500f56f0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765806 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt46p\" (UniqueName: \"kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p\") pod \"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765826 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkbq5\" (UniqueName: \"kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5\") pod \"945717bc-405f-4628-934c-66e4500f56f0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765842 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.768007 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.775147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs" (OuterVolumeSpecName: "logs") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.776251 4885 generic.go:334] "Generic (PLEG): container finished" podID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerID="fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12" exitCode=0 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.777462 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerDied","Data":"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.777498 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerDied","Data":"4d0a6fa6c058e8e3d990a208a072ec9c1c565777e02360b62370fc36d2e37246"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.777966 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.787619 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.788392 4885 generic.go:334] "Generic (PLEG): container finished" podID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" containerID="9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b" exitCode=0 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.788445 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197","Type":"ContainerDied","Data":"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.788493 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197","Type":"ContainerDied","Data":"e45bf247b9d279f13a7ad3c2bb090db1b20d8e474638af4ca7545e5e6c5bd1a9"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.788536 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.790940 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.791392 4885 scope.go:117] "RemoveContainer" containerID="f5e5f2790360729d9c4394c0e85bb4e8ea8164ab35be9023623e79b3a117f852" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.793616 4885 generic.go:334] "Generic (PLEG): container finished" podID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerID="786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44" exitCode=0 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.793658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerDied","Data":"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.793698 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerDied","Data":"9666e26b13c4933935ee0abcb40c76da8cace1d3e077db5278af8135676f6e1f"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.793739 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.795184 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd" (OuterVolumeSpecName: "kube-api-access-26jxd") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "kube-api-access-26jxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.798196 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f1f46cb2-c95d-40f5-9acc-720e094b91bc/ovn-northd/0.log" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.798242 4885 generic.go:334] "Generic (PLEG): container finished" podID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerID="e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f" exitCode=139 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.798279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerDied","Data":"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.798297 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerDied","Data":"cd1df5e26dfde01021643639b3d30a9000c123fa83c48692684173ba1b046531"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.798368 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.806269 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.806941 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5" (OuterVolumeSpecName: "kube-api-access-nkbq5") pod "945717bc-405f-4628-934c-66e4500f56f0" (UID: "945717bc-405f-4628-934c-66e4500f56f0"). InnerVolumeSpecName "kube-api-access-nkbq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.807060 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.809677 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb" (OuterVolumeSpecName: "kube-api-access-nwjwb") pod "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" (UID: "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197"). InnerVolumeSpecName "kube-api-access-nwjwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.811963 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerDied","Data":"94340869c04149bbf12f6aa9cc506894a0395e25d240510a2463c412aa09d8dc"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.813701 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p" (OuterVolumeSpecName: "kube-api-access-wt46p") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "kube-api-access-wt46p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.816068 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.832360 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts" (OuterVolumeSpecName: "scripts") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.836142 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.847965 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.848721 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.855135 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data" (OuterVolumeSpecName: "config-data") pod "945717bc-405f-4628-934c-66e4500f56f0" (UID: "945717bc-405f-4628-934c-66e4500f56f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.862788 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870459 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt46p\" (UniqueName: \"kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870513 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkbq5\" (UniqueName: \"kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870523 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870531 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870585 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870596 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwjwb\" (UniqueName: \"kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870604 4885 reconciler_common.go:293] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870614 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870626 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870635 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26jxd\" (UniqueName: \"kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.882011 4885 scope.go:117] "RemoveContainer" containerID="2d562399fb223e806d5f3ddff2425b5e427d18a16330c90ac41a561625d41719" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.882449 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.890632 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.891955 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.903804 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.913019 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" (UID: "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.914492 4885 scope.go:117] "RemoveContainer" containerID="88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.915825 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data" (OuterVolumeSpecName: "config-data") pod "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" (UID: "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.917461 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.925321 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.927986 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data" (OuterVolumeSpecName: "config-data") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.947362 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.950131 4885 scope.go:117] "RemoveContainer" containerID="8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.956664 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.956833 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.961762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "945717bc-405f-4628-934c-66e4500f56f0" (UID: "945717bc-405f-4628-934c-66e4500f56f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.962950 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.968497 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.969413 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972599 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972696 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972751 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972802 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972852 4885 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972932 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972992 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.973043 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972973 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.978979 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.982298 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.982540 4885 scope.go:117] "RemoveContainer" containerID="88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67" Mar 08 19:56:30 crc kubenswrapper[4885]: E0308 19:56:30.983185 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67\": container with ID starting with 88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67 not found: ID does not exist" 
containerID="88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.983231 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67"} err="failed to get container status \"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67\": rpc error: code = NotFound desc = could not find container \"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67\": container with ID starting with 88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67 not found: ID does not exist" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.983280 4885 scope.go:117] "RemoveContainer" containerID="8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2" Mar 08 19:56:30 crc kubenswrapper[4885]: E0308 19:56:30.983639 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2\": container with ID starting with 8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2 not found: ID does not exist" containerID="8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.983674 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2"} err="failed to get container status \"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2\": rpc error: code = NotFound desc = could not find container \"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2\": container with ID starting with 8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2 not found: ID does not exist" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.983699 4885 scope.go:117] 
"RemoveContainer" containerID="41d8583c3d498141cc2e38d1ed8623082609a86faa3f087e124f2692ac0c8871" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.992826 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data" (OuterVolumeSpecName: "config-data") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.001613 4885 scope.go:117] "RemoveContainer" containerID="4e4c7d9e4404bd1a5433f1787f2f7abf1d5d2e0fd51aebb6079e0aa7c48cd16e" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.022119 4885 scope.go:117] "RemoveContainer" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.038095 4885 scope.go:117] "RemoveContainer" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.038645 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0\": container with ID starting with aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0 not found: ID does not exist" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.038688 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0"} err="failed to get container status \"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0\": rpc error: code = NotFound desc = could not find container \"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0\": 
container with ID starting with aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.038721 4885 scope.go:117] "RemoveContainer" containerID="7305f11ea3e6d044101cca24b88547af01f2e1506724f8e566c2a9df42c34dc6" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.059027 4885 scope.go:117] "RemoveContainer" containerID="48e1f046f7d97f16af118173fbff33a7753d9f3ef98b111d3153850bfbfdaf65" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.074087 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.091469 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.097688 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.099632 4885 scope.go:117] "RemoveContainer" containerID="ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.119962 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.137228 4885 scope.go:117] "RemoveContainer" containerID="fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.140122 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.158433 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-796cf584f6-dfmcm"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.168377 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-api-796cf584f6-dfmcm"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.173639 4885 scope.go:117] "RemoveContainer" containerID="ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.174214 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5\": container with ID starting with ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5 not found: ID does not exist" containerID="ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.174333 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5"} err="failed to get container status \"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5\": rpc error: code = NotFound desc = could not find container \"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5\": container with ID starting with ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.175149 4885 scope.go:117] "RemoveContainer" containerID="fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.175303 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.176243 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12\": container with ID starting with fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12 not found: ID does 
not exist" containerID="fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.176275 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12"} err="failed to get container status \"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12\": rpc error: code = NotFound desc = could not find container \"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12\": container with ID starting with fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.176295 4885 scope.go:117] "RemoveContainer" containerID="9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.182229 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.215996 4885 scope.go:117] "RemoveContainer" containerID="9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.220032 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b\": container with ID starting with 9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b not found: ID does not exist" containerID="9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.220069 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b"} err="failed to get container status 
\"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b\": rpc error: code = NotFound desc = could not find container \"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b\": container with ID starting with 9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.220091 4885 scope.go:117] "RemoveContainer" containerID="786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.245010 4885 scope.go:117] "RemoveContainer" containerID="d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.261815 4885 scope.go:117] "RemoveContainer" containerID="786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.262289 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44\": container with ID starting with 786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44 not found: ID does not exist" containerID="786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.262336 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44"} err="failed to get container status \"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44\": rpc error: code = NotFound desc = could not find container \"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44\": container with ID starting with 786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.262368 4885 
scope.go:117] "RemoveContainer" containerID="d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.263392 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b\": container with ID starting with d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b not found: ID does not exist" containerID="d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.263421 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b"} err="failed to get container status \"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b\": rpc error: code = NotFound desc = could not find container \"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b\": container with ID starting with d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.263443 4885 scope.go:117] "RemoveContainer" containerID="619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.290719 4885 scope.go:117] "RemoveContainer" containerID="e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.308365 4885 scope.go:117] "RemoveContainer" containerID="619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.309276 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0\": container with ID starting with 
619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0 not found: ID does not exist" containerID="619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.309313 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0"} err="failed to get container status \"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0\": rpc error: code = NotFound desc = could not find container \"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0\": container with ID starting with 619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.309338 4885 scope.go:117] "RemoveContainer" containerID="e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.309630 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f\": container with ID starting with e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f not found: ID does not exist" containerID="e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.309651 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f"} err="failed to get container status \"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f\": rpc error: code = NotFound desc = could not find container \"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f\": container with ID starting with e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f not found: ID does not 
exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.309667 4885 scope.go:117] "RemoveContainer" containerID="2d8347d05b060d48223bdd22690e18395a24601df91142452e20c85edd93a56a" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.353530 4885 scope.go:117] "RemoveContainer" containerID="8eea35d6899ebbfa973a2d7d5bbaae841e42e4044906216a557300884b93a37e" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.382185 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09db13b9-d564-49c9-b383-5fbfe0e43c9b" path="/var/lib/kubelet/pods/09db13b9-d564-49c9-b383-5fbfe0e43c9b/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.382381 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.382598 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" path="/var/lib/kubelet/pods/13df70e2-1a9e-4d81-b23b-c461291bce93/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.382720 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data podName:01dc1fd5-4e2f-4129-9452-ed50fa1d182b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:39.382536627 +0000 UTC m=+1500.778590640 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data") pod "rabbitmq-server-0" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b") : configmap "rabbitmq-config-data" not found Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.384382 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" path="/var/lib/kubelet/pods/35e55887-f8af-4c57-820d-c46d0ee9cd9f/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.386519 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" path="/var/lib/kubelet/pods/50b429e9-fb10-48ba-b15c-ec25d57e707a/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.387846 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619a568c-d0c3-408b-96c1-39a3a769d1ad" path="/var/lib/kubelet/pods/619a568c-d0c3-408b-96c1-39a3a769d1ad/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.388306 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" path="/var/lib/kubelet/pods/64baa35e-d1c2-48fe-a7a1-d0a4d1485908/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.389423 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" path="/var/lib/kubelet/pods/719b68df-d1ac-49e5-ac34-dfa3ba33c97f/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.390903 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" path="/var/lib/kubelet/pods/93f52f98-0e26-4fc1-a9af-f580531f8550/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.391428 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="945717bc-405f-4628-934c-66e4500f56f0" 
path="/var/lib/kubelet/pods/945717bc-405f-4628-934c-66e4500f56f0/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.392379 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e649171-680c-445a-b418-734a5c7322e3" path="/var/lib/kubelet/pods/9e649171-680c-445a-b418-734a5c7322e3/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.392709 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a083a431-5afc-4289-a5cf-625bc619465e" path="/var/lib/kubelet/pods/a083a431-5afc-4289-a5cf-625bc619465e/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.393427 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3e6330b-4e2d-44ca-b9be-d36b2f613571" path="/var/lib/kubelet/pods/a3e6330b-4e2d-44ca-b9be-d36b2f613571/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.393948 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" path="/var/lib/kubelet/pods/cdd926a8-442c-4f63-bb36-3e6a425436c2/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.395063 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da1d62ba-4033-4906-87c1-d673c1ab8637" path="/var/lib/kubelet/pods/da1d62ba-4033-4906-87c1-d673c1ab8637/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.395634 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" path="/var/lib/kubelet/pods/e4ca493a-f707-45c3-b457-1a1053c3dfe5/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.396346 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" path="/var/lib/kubelet/pods/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.397643 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" 
path="/var/lib/kubelet/pods/f1f46cb2-c95d-40f5-9acc-720e094b91bc/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.420862 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483577 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q2mg\" (UniqueName: \"kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483622 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483672 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483753 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483824 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: 
\"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483851 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483944 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483964 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.488296 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg" (OuterVolumeSpecName: "kube-api-access-2q2mg") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "kube-api-access-2q2mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.490021 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts" (OuterVolumeSpecName: "scripts") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.493318 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.495033 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.517533 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data" (OuterVolumeSpecName: "config-data") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.522222 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.539089 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.542540 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.585958 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.585993 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.586004 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.586014 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc 
kubenswrapper[4885]: I0308 19:56:31.586023 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.586031 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q2mg\" (UniqueName: \"kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.586040 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.586050 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.830271 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" containerID="0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421" exitCode=0 Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.830385 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.830398 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-574d5c476f-sq4hm" event={"ID":"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc","Type":"ContainerDied","Data":"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421"} Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.830451 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-574d5c476f-sq4hm" event={"ID":"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc","Type":"ContainerDied","Data":"bb3823d25975ffc500551905492324189bd2643724049b62ce6f52a7469d0c61"} Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.830478 4885 scope.go:117] "RemoveContainer" containerID="0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.880065 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.882257 4885 scope.go:117] "RemoveContainer" containerID="0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.882830 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421\": container with ID starting with 0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421 not found: ID does not exist" containerID="0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.882885 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421"} err="failed to get container status 
\"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421\": rpc error: code = NotFound desc = could not find container \"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421\": container with ID starting with 0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.883748 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"] Mar 08 19:56:32 crc kubenswrapper[4885]: E0308 19:56:32.195658 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:32 crc kubenswrapper[4885]: E0308 19:56:32.195791 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data podName:96257eac-42ec-44cf-80be-9be68c0ebb1b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:40.195761657 +0000 UTC m=+1501.591815710 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b") : configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:32 crc kubenswrapper[4885]: I0308 19:56:32.819791 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:56:32 crc kubenswrapper[4885]: I0308 19:56:32.819844 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:56:32 crc kubenswrapper[4885]: I0308 19:56:32.851525 4885 generic.go:334] "Generic (PLEG): container finished" podID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerID="ab00747eae0e5726409cc3faafb18065815833a98680950d3e1962529cb0f73d" exitCode=0 Mar 08 19:56:32 crc kubenswrapper[4885]: I0308 19:56:32.851592 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerDied","Data":"ab00747eae0e5726409cc3faafb18065815833a98680950d3e1962529cb0f73d"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.014568 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111109 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjj2h\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111558 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111582 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111648 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111690 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111706 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111749 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111779 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111801 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111822 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc 
kubenswrapper[4885]: I0308 19:56:33.113470 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.113831 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.116500 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info" (OuterVolumeSpecName: "pod-info") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.117022 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.117282 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.132201 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h" (OuterVolumeSpecName: "kube-api-access-wjj2h") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "kube-api-access-wjj2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.136307 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.140187 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.155061 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data" (OuterVolumeSpecName: "config-data") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.164303 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf" (OuterVolumeSpecName: "server-conf") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.200518 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213352 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213400 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213422 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213442 4885 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213461 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjj2h\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213509 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213527 4885 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 
19:56:33.213548 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213566 4885 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213582 4885 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213598 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.221021 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.241942 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.314355 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.314546 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.314777 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.314984 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315117 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: 
\"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315236 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315325 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9bwn\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315361 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315404 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315475 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315221 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.316021 4885 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.316054 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.316208 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.316948 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). 
InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.317572 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.318048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.319315 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.319693 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info" (OuterVolumeSpecName: "pod-info") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.319869 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn" (OuterVolumeSpecName: "kube-api-access-h9bwn") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "kube-api-access-h9bwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.334786 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data" (OuterVolumeSpecName: "config-data") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.359202 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf" (OuterVolumeSpecName: "server-conf") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.376328 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.380408 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" path="/var/lib/kubelet/pods/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc/volumes" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418213 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418250 4885 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418263 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418679 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418697 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418713 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418750 4885 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-h9bwn\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.418716 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418763 4885 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418893 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418967 4885 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.419141 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 
19:56:33.419623 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.419691 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.420245 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.421519 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.423666 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.423698 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.441745 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.528706 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.885944 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a083cf5-4ca2-440c-840a-6b159151609f" containerID="b8d53aa1399bba98dc12433735d0a8b3cb69b3036f3c8fb648dbc900fdb658b2" exitCode=0 Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.886022 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerDied","Data":"b8d53aa1399bba98dc12433735d0a8b3cb69b3036f3c8fb648dbc900fdb658b2"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.887509 4885 generic.go:334] "Generic (PLEG): container finished" podID="a7268474-e124-4139-bf24-6b3f605b9511" containerID="6e7be97046549290741b9a7850306bb8d9be298e24617283ccb5d04dda12497f" exitCode=0 Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.887577 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" 
event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerDied","Data":"6e7be97046549290741b9a7850306bb8d9be298e24617283ccb5d04dda12497f"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.889626 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerID="7e93f87815197e303bd6f0ad768ac092887798b010a49cf9460b37861d1fc6db" exitCode=0 Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.889694 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerDied","Data":"7e93f87815197e303bd6f0ad768ac092887798b010a49cf9460b37861d1fc6db"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.891150 4885 generic.go:334] "Generic (PLEG): container finished" podID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerID="c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e" exitCode=0 Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.891211 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerDied","Data":"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.891243 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerDied","Data":"654fe72412f8a73fefea3c7f4b820f3cc3985166e76d795cc9ca36d6cf741354"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.891266 4885 scope.go:117] "RemoveContainer" containerID="c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.891288 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.894085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerDied","Data":"8b6317b734453e8868ed75e3450ff9740fcef0ac699a30e38e0adad2d77d26bc"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.894176 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.922730 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.928817 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.941201 4885 scope.go:117] "RemoveContainer" containerID="67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.946122 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.959480 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.980125 4885 scope.go:117] "RemoveContainer" containerID="c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e" Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.980968 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e\": container with ID starting with c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e not found: ID does not exist" containerID="c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e" Mar 08 
19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.981013 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e"} err="failed to get container status \"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e\": rpc error: code = NotFound desc = could not find container \"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e\": container with ID starting with c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e not found: ID does not exist" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.981041 4885 scope.go:117] "RemoveContainer" containerID="67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0" Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.984196 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0\": container with ID starting with 67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0 not found: ID does not exist" containerID="67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.984230 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0"} err="failed to get container status \"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0\": rpc error: code = NotFound desc = could not find container \"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0\": container with ID starting with 67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0 not found: ID does not exist" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.984252 4885 scope.go:117] "RemoveContainer" 
containerID="ab00747eae0e5726409cc3faafb18065815833a98680950d3e1962529cb0f73d" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.015138 4885 scope.go:117] "RemoveContainer" containerID="f7d40d12aee399534fa9d02af86ea25978b99ea1398acccdac988f16615d42dd" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.090442 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.135968 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n89rb\" (UniqueName: \"kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136015 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136035 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136685 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136827 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136854 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136894 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136986 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.137104 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.137474 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.141870 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts" (OuterVolumeSpecName: "scripts") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.142197 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.146445 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb" (OuterVolumeSpecName: "kube-api-access-n89rb") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "kube-api-access-n89rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.155025 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.162370 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.166532 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.211130 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.211402 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.232116 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data" (OuterVolumeSpecName: "config-data") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238143 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs\") pod \"a7268474-e124-4139-bf24-6b3f605b9511\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle\") pod \"2a083cf5-4ca2-440c-840a-6b159151609f\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238222 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td4rc\" (UniqueName: \"kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc\") pod \"a7268474-e124-4139-bf24-6b3f605b9511\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238255 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfdxd\" (UniqueName: \"kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd\") pod \"2a083cf5-4ca2-440c-840a-6b159151609f\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238305 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle\") pod \"a7268474-e124-4139-bf24-6b3f605b9511\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238332 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom\") pod \"a7268474-e124-4139-bf24-6b3f605b9511\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238350 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs\") pod \"2a083cf5-4ca2-440c-840a-6b159151609f\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238385 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data\") pod \"a7268474-e124-4139-bf24-6b3f605b9511\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238420 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom\") pod \"2a083cf5-4ca2-440c-840a-6b159151609f\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238453 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data\") pod \"2a083cf5-4ca2-440c-840a-6b159151609f\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238753 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs" (OuterVolumeSpecName: "logs") pod "a7268474-e124-4139-bf24-6b3f605b9511" (UID: "a7268474-e124-4139-bf24-6b3f605b9511"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239024 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs" (OuterVolumeSpecName: "logs") pod "2a083cf5-4ca2-440c-840a-6b159151609f" (UID: "2a083cf5-4ca2-440c-840a-6b159151609f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239683 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239701 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239711 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239738 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239748 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239757 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n89rb\" (UniqueName: 
\"kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239765 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239774 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239782 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.240835 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd" (OuterVolumeSpecName: "kube-api-access-gfdxd") pod "2a083cf5-4ca2-440c-840a-6b159151609f" (UID: "2a083cf5-4ca2-440c-840a-6b159151609f"). InnerVolumeSpecName "kube-api-access-gfdxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.241892 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc" (OuterVolumeSpecName: "kube-api-access-td4rc") pod "a7268474-e124-4139-bf24-6b3f605b9511" (UID: "a7268474-e124-4139-bf24-6b3f605b9511"). InnerVolumeSpecName "kube-api-access-td4rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.243696 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2a083cf5-4ca2-440c-840a-6b159151609f" (UID: "2a083cf5-4ca2-440c-840a-6b159151609f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.245942 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a7268474-e124-4139-bf24-6b3f605b9511" (UID: "a7268474-e124-4139-bf24-6b3f605b9511"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.254788 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a083cf5-4ca2-440c-840a-6b159151609f" (UID: "2a083cf5-4ca2-440c-840a-6b159151609f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.259378 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7268474-e124-4139-bf24-6b3f605b9511" (UID: "a7268474-e124-4139-bf24-6b3f605b9511"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.272260 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data" (OuterVolumeSpecName: "config-data") pod "2a083cf5-4ca2-440c-840a-6b159151609f" (UID: "2a083cf5-4ca2-440c-840a-6b159151609f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.276292 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data" (OuterVolumeSpecName: "config-data") pod "a7268474-e124-4139-bf24-6b3f605b9511" (UID: "a7268474-e124-4139-bf24-6b3f605b9511"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341303 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfdxd\" (UniqueName: \"kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341346 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341364 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341381 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data\") on node \"crc\" DevicePath \"\"" Mar 
08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341398 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341415 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341433 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341451 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td4rc\" (UniqueName: \"kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.915083 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerDied","Data":"7e0b4a7b5579c233c2e49f4395b1b83ad7591cab769b791a33fa19e09b808340"} Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.915175 4885 scope.go:117] "RemoveContainer" containerID="b8d53aa1399bba98dc12433735d0a8b3cb69b3036f3c8fb648dbc900fdb658b2" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.915108 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.922074 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerDied","Data":"7d126567b856b73925e9e50b783a515a23fdff84d4ca27fd2089e38d86b58980"} Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.922218 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.927846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerDied","Data":"a1d4f26c989a88dcb6b8292fefbf9776a8ae2f04c4981fe1a6564019613a69ba"} Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.928067 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.959651 4885 scope.go:117] "RemoveContainer" containerID="d5cd5c3527dc17515d5a33bed3c5118e0fcbd6d15187bcfb409883f29afc80a6" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.983025 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"] Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.988820 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"] Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.997180 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.005742 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.012144 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"] Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.017573 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"] Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.023621 4885 scope.go:117] "RemoveContainer" containerID="6e7be97046549290741b9a7850306bb8d9be298e24617283ccb5d04dda12497f" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.045703 4885 scope.go:117] "RemoveContainer" containerID="3a20cf21bbfb4da8c71131e4075d64b83bae96d5c5020bc3cfadcf8d7226f8bc" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.073064 4885 scope.go:117] "RemoveContainer" containerID="46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.094599 4885 scope.go:117] "RemoveContainer" containerID="c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 
19:56:35.116902 4885 scope.go:117] "RemoveContainer" containerID="7e93f87815197e303bd6f0ad768ac092887798b010a49cf9460b37861d1fc6db" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.136149 4885 scope.go:117] "RemoveContainer" containerID="a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.386414 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" path="/var/lib/kubelet/pods/01dc1fd5-4e2f-4129-9452-ed50fa1d182b/volumes" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.387153 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" path="/var/lib/kubelet/pods/2a083cf5-4ca2-440c-840a-6b159151609f/volumes" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.392512 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" path="/var/lib/kubelet/pods/6a1f465c-123b-455f-8bd8-720d3f8a4bef/volumes" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.395617 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" path="/var/lib/kubelet/pods/96257eac-42ec-44cf-80be-9be68c0ebb1b/volumes" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.397298 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7268474-e124-4139-bf24-6b3f605b9511" path="/var/lib/kubelet/pods/a7268474-e124-4139-bf24-6b3f605b9511/volumes" Mar 08 19:56:38 crc kubenswrapper[4885]: I0308 19:56:38.006068 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: i/o timeout" Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.418765 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code 
= NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.419597 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.419901 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.420033 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.420217 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.421260 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.424175 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.424221 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.021621 4885 generic.go:334] "Generic (PLEG): container finished" podID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerID="17e37a1234fac68b042cb982b6be421ba7a3bd54c84d93b8bbb1842a9f1fa332" exitCode=0 Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.021734 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerDied","Data":"17e37a1234fac68b042cb982b6be421ba7a3bd54c84d93b8bbb1842a9f1fa332"} Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.022416 4885 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerDied","Data":"021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc"} Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.022437 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.109790 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165550 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165617 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165647 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdbrd\" (UniqueName: \"kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: 
\"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165801 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165831 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165893 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.191792 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd" (OuterVolumeSpecName: "kube-api-access-sdbrd") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "kube-api-access-sdbrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.193298 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.220796 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.230848 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.240056 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config" (OuterVolumeSpecName: "config") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.240091 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.240347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267713 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267771 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdbrd\" (UniqueName: \"kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267794 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267815 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267836 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267854 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267872 4885 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:43 crc kubenswrapper[4885]: I0308 19:56:43.035968 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:56:43 crc kubenswrapper[4885]: I0308 19:56:43.095002 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"] Mar 08 19:56:43 crc kubenswrapper[4885]: I0308 19:56:43.101100 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"] Mar 08 19:56:43 crc kubenswrapper[4885]: I0308 19:56:43.393739 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" path="/var/lib/kubelet/pods/d1b91750-253e-46eb-9a1c-f7208dab2496/volumes" Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.419443 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.420265 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" 
containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.420799 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.420852 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.423707 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.427368 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.429988 4885 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.430101 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.419190 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.420511 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.421289 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" 
containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.421396 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.424652 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.427437 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.429741 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.429913 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.419846 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.422142 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.422173 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.422977 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.423074 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.423622 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.425411 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.425470 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.189691 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pp4rs_88c2918a-548b-4b78-a34c-2aa2969ee2cd/ovs-vswitchd/0.log" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.191757 4885 generic.go:334] 
"Generic (PLEG): container finished" podID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" exitCode=137 Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.191826 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerDied","Data":"e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1"} Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.545486 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pp4rs_88c2918a-548b-4b78-a34c-2aa2969ee2cd/ovs-vswitchd/0.log" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.546123 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622517 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622558 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622589 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622621 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66rvx\" (UniqueName: \"kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622643 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622662 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib" (OuterVolumeSpecName: "var-lib") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622673 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log" (OuterVolumeSpecName: "var-log") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622693 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run" (OuterVolumeSpecName: "var-run") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622736 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622758 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.623046 4885 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.623065 4885 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.623075 4885 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.623085 4885 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.623893 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts" (OuterVolumeSpecName: "scripts") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.640205 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx" (OuterVolumeSpecName: "kube-api-access-66rvx") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "kube-api-access-66rvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.708283 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.723909 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66rvx\" (UniqueName: \"kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.723952 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825499 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825549 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825602 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825658 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825717 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22mr6\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825764 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.826384 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock" (OuterVolumeSpecName: "lock") pod "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.826759 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache" (OuterVolumeSpecName: "cache") pod "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.829699 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.829956 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6" (OuterVolumeSpecName: "kube-api-access-22mr6") pod "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "kube-api-access-22mr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.831603 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "swift") pod "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.927996 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.928051 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22mr6\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.928075 4885 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.928099 4885 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.928120 4885 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.956343 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.030254 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.182557 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.204852 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pp4rs_88c2918a-548b-4b78-a34c-2aa2969ee2cd/ovs-vswitchd/0.log" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.206407 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerDied","Data":"e8b9e6003711ba0073f7cced036d1550a5aac01aa3276b7f4a1f8ca2c14ba942"} Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.206464 4885 scope.go:117] "RemoveContainer" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.206486 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.219983 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2" exitCode=137 Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.220024 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.220050 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2"} Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.220093 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"b1e0ba5e2e1e5dbd0a9f1e5b591c441cf6aacd9c13c59e7ce49d3304923b9158"} Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.239561 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.241569 4885 scope.go:117] "RemoveContainer" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.284995 4885 scope.go:117] "RemoveContainer" containerID="4d70b99d630277ded10493eacfddd38fddedced2d880750db49b6b3f39017dba" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.291768 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.298911 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.306224 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.310411 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.322815 4885 scope.go:117] 
"RemoveContainer" containerID="9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.351509 4885 scope.go:117] "RemoveContainer" containerID="64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.374899 4885 scope.go:117] "RemoveContainer" containerID="7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.397591 4885 scope.go:117] "RemoveContainer" containerID="6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.422245 4885 scope.go:117] "RemoveContainer" containerID="37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.453822 4885 scope.go:117] "RemoveContainer" containerID="d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.481710 4885 scope.go:117] "RemoveContainer" containerID="2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.512579 4885 scope.go:117] "RemoveContainer" containerID="e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.534403 4885 scope.go:117] "RemoveContainer" containerID="626657923ce6ed6491828eb9e3d29e03cb9ceee45223fbdf56fc2006030e8b1d" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.546491 4885 scope.go:117] "RemoveContainer" containerID="6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.627911 4885 scope.go:117] "RemoveContainer" containerID="25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.672145 4885 scope.go:117] "RemoveContainer" 
containerID="a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.706092 4885 scope.go:117] "RemoveContainer" containerID="3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.749975 4885 scope.go:117] "RemoveContainer" containerID="904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.781385 4885 scope.go:117] "RemoveContainer" containerID="5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.820284 4885 scope.go:117] "RemoveContainer" containerID="803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.853448 4885 scope.go:117] "RemoveContainer" containerID="9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.854108 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2\": container with ID starting with 9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2 not found: ID does not exist" containerID="9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.854177 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2"} err="failed to get container status \"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2\": rpc error: code = NotFound desc = could not find container \"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2\": container with ID starting with 
9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.854223 4885 scope.go:117] "RemoveContainer" containerID="64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.854617 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb\": container with ID starting with 64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb not found: ID does not exist" containerID="64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.854672 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb"} err="failed to get container status \"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb\": rpc error: code = NotFound desc = could not find container \"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb\": container with ID starting with 64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.854713 4885 scope.go:117] "RemoveContainer" containerID="7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.855144 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca\": container with ID starting with 7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca not found: ID does not exist" containerID="7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca" Mar 08 19:56:56 crc 
kubenswrapper[4885]: I0308 19:56:56.855182 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca"} err="failed to get container status \"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca\": rpc error: code = NotFound desc = could not find container \"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca\": container with ID starting with 7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.855210 4885 scope.go:117] "RemoveContainer" containerID="6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.855591 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474\": container with ID starting with 6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474 not found: ID does not exist" containerID="6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.855639 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474"} err="failed to get container status \"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474\": rpc error: code = NotFound desc = could not find container \"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474\": container with ID starting with 6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.855668 4885 scope.go:117] "RemoveContainer" containerID="37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a" Mar 08 
19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.856064 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a\": container with ID starting with 37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a not found: ID does not exist" containerID="37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.856098 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a"} err="failed to get container status \"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a\": rpc error: code = NotFound desc = could not find container \"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a\": container with ID starting with 37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.856121 4885 scope.go:117] "RemoveContainer" containerID="d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.856503 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3\": container with ID starting with d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3 not found: ID does not exist" containerID="d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.856553 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3"} err="failed to get container status 
\"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3\": rpc error: code = NotFound desc = could not find container \"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3\": container with ID starting with d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.856583 4885 scope.go:117] "RemoveContainer" containerID="2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.857085 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd\": container with ID starting with 2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd not found: ID does not exist" containerID="2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.857144 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd"} err="failed to get container status \"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd\": rpc error: code = NotFound desc = could not find container \"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd\": container with ID starting with 2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.857183 4885 scope.go:117] "RemoveContainer" containerID="e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.857716 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f\": container with ID starting with e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f not found: ID does not exist" containerID="e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.857763 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f"} err="failed to get container status \"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f\": rpc error: code = NotFound desc = could not find container \"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f\": container with ID starting with e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.857789 4885 scope.go:117] "RemoveContainer" containerID="6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.858332 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81\": container with ID starting with 6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81 not found: ID does not exist" containerID="6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.858387 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81"} err="failed to get container status \"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81\": rpc error: code = NotFound desc = could not find container \"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81\": container with ID 
starting with 6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.858430 4885 scope.go:117] "RemoveContainer" containerID="25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.859273 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd\": container with ID starting with 25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd not found: ID does not exist" containerID="25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.859337 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd"} err="failed to get container status \"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd\": rpc error: code = NotFound desc = could not find container \"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd\": container with ID starting with 25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.859370 4885 scope.go:117] "RemoveContainer" containerID="a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.859775 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc\": container with ID starting with a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc not found: ID does not exist" containerID="a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc" Mar 08 
19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.859966 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc"} err="failed to get container status \"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc\": rpc error: code = NotFound desc = could not find container \"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc\": container with ID starting with a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.860122 4885 scope.go:117] "RemoveContainer" containerID="3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.861510 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653\": container with ID starting with 3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653 not found: ID does not exist" containerID="3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.861565 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653"} err="failed to get container status \"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653\": rpc error: code = NotFound desc = could not find container \"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653\": container with ID starting with 3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.861604 4885 scope.go:117] "RemoveContainer" 
containerID="904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.862140 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6\": container with ID starting with 904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6 not found: ID does not exist" containerID="904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.862178 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6"} err="failed to get container status \"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6\": rpc error: code = NotFound desc = could not find container \"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6\": container with ID starting with 904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.862203 4885 scope.go:117] "RemoveContainer" containerID="5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.862569 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82\": container with ID starting with 5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82 not found: ID does not exist" containerID="5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.862617 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82"} err="failed to get container status \"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82\": rpc error: code = NotFound desc = could not find container \"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82\": container with ID starting with 5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.862647 4885 scope.go:117] "RemoveContainer" containerID="803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.863027 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542\": container with ID starting with 803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542 not found: ID does not exist" containerID="803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.863070 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542"} err="failed to get container status \"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542\": rpc error: code = NotFound desc = could not find container \"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542\": container with ID starting with 803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542 not found: ID does not exist" Mar 08 19:56:57 crc kubenswrapper[4885]: I0308 19:56:57.390480 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" path="/var/lib/kubelet/pods/88c2918a-548b-4b78-a34c-2aa2969ee2cd/volumes" Mar 08 19:56:57 crc kubenswrapper[4885]: I0308 
19:56:57.392451 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" path="/var/lib/kubelet/pods/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd/volumes" Mar 08 19:56:59 crc kubenswrapper[4885]: I0308 19:56:59.568324 4885 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podda1d62ba-4033-4906-87c1-d673c1ab8637"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podda1d62ba-4033-4906-87c1-d673c1ab8637] : Timed out while waiting for systemd to remove kubepods-besteffort-podda1d62ba_4033_4906_87c1_d673c1ab8637.slice" Mar 08 19:56:59 crc kubenswrapper[4885]: I0308 19:56:59.687827 4885 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod09db13b9-d564-49c9-b383-5fbfe0e43c9b"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod09db13b9-d564-49c9-b383-5fbfe0e43c9b] : Timed out while waiting for systemd to remove kubepods-besteffort-pod09db13b9_d564_49c9_b383_5fbfe0e43c9b.slice" Mar 08 19:56:59 crc kubenswrapper[4885]: I0308 19:56:59.693053 4885 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod619a568c-d0c3-408b-96c1-39a3a769d1ad"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod619a568c-d0c3-408b-96c1-39a3a769d1ad] : Timed out while waiting for systemd to remove kubepods-besteffort-pod619a568c_d0c3_408b_96c1_39a3a769d1ad.slice" Mar 08 19:57:02 crc kubenswrapper[4885]: I0308 19:57:02.818162 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:57:02 crc kubenswrapper[4885]: I0308 19:57:02.818571 4885 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:57:32 crc kubenswrapper[4885]: I0308 19:57:32.818634 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:57:32 crc kubenswrapper[4885]: I0308 19:57:32.819387 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:57:32 crc kubenswrapper[4885]: I0308 19:57:32.819463 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:57:32 crc kubenswrapper[4885]: I0308 19:57:32.820362 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 19:57:32 crc kubenswrapper[4885]: I0308 19:57:32.820477 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" 
containerID="cri-o://b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8" gracePeriod=600 Mar 08 19:57:33 crc kubenswrapper[4885]: I0308 19:57:33.680597 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8" exitCode=0 Mar 08 19:57:33 crc kubenswrapper[4885]: I0308 19:57:33.680720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8"} Mar 08 19:57:33 crc kubenswrapper[4885]: I0308 19:57:33.681367 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400"} Mar 08 19:57:33 crc kubenswrapper[4885]: I0308 19:57:33.681401 4885 scope.go:117] "RemoveContainer" containerID="c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.504823 4885 scope.go:117] "RemoveContainer" containerID="3cb04d8216824e70d6b5ea33718713bb6914ece1b0e3362b1186f648f1502b81" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.549015 4885 scope.go:117] "RemoveContainer" containerID="58b318e6af3a5db8b09b96a9de226a379d7375fee61bd37b949548ceef13806c" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.589641 4885 scope.go:117] "RemoveContainer" containerID="d990977988383de183ee74b10460a2aef417ed74ff41f049c648f4b0922ddb17" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.624375 4885 scope.go:117] "RemoveContainer" containerID="57d8097d34b17ff81e694e75a211c6042455808aeca7d092f8501d703a78d088" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.649874 4885 
scope.go:117] "RemoveContainer" containerID="2f31946378ed0ae4efcfd55a18f638cc84b0a18f97193739711ef28dac2174f9" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.675636 4885 scope.go:117] "RemoveContainer" containerID="2ff4df6777cb04e247eca00bf1613dce65653cf286ef17867253f3e89e727d13" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.702689 4885 scope.go:117] "RemoveContainer" containerID="e5feabe92d49eb8fd4bb48801094df276f9bf1fc07181b4b0ee0908d604394fb" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.735878 4885 scope.go:117] "RemoveContainer" containerID="b4398eab96435c81b8a2366ba9291b7b0c13edf908fc801823865f8458709b7a" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.760214 4885 scope.go:117] "RemoveContainer" containerID="ca2add6996115e29bd86a097fbce1cceadad7160db189d6c7e405a523a1ccb6e" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158113 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549998-97pgs"] Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158532 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c3ea8e-9683-45b9-805b-d1049840b0da" containerName="kube-state-metrics" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158553 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c3ea8e-9683-45b9-805b-d1049840b0da" containerName="kube-state-metrics" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158578 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158592 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158609 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158643 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158667 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="proxy-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158680 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="proxy-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158725 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158738 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158758 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158769 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158784 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158796 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-api" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158815 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="probe" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158827 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="probe" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158841 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158853 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158875 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158887 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158903 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158915 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158957 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-reaper" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158969 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-reaper" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158988 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158999 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-server" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159015 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159027 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159042 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159054 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159067 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159078 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159104 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159116 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-server" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159136 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159147 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159162 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="rsync" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159173 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="rsync" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159193 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159205 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159224 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-expirer" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159238 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-expirer" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159252 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="swift-recon-cron" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159264 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="swift-recon-cron" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159283 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="rabbitmq" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159294 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="rabbitmq" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159310 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159322 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159342 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="ovn-northd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159353 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="ovn-northd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159373 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server-init" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159385 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server-init" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159406 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159418 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-api" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159436 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159448 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-server" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159468 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="setup-container" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159480 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="setup-container" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159498 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159509 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159532 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="openstack-network-exporter" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159543 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="openstack-network-exporter" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159557 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159568 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-api" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159592 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-notification-agent" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159603 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-notification-agent" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159621 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="cinder-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159633 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="cinder-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159651 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159663 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159684 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" containerName="nova-cell0-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159696 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" containerName="nova-cell0-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159713 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159725 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159741 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159752 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159767 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="rabbitmq" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159779 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="rabbitmq" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159793 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159804 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159825 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="galera" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159836 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="galera" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159851 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159863 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159878 4885 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" containerName="keystone-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159890 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" containerName="keystone-api" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159907 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="sg-core" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159991 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="sg-core" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160012 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160027 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160042 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160054 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160070 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160160 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160220 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="setup-container" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160235 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="setup-container" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160259 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1d62ba-4033-4906-87c1-d673c1ab8637" containerName="memcached" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160272 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1d62ba-4033-4906-87c1-d673c1ab8637" containerName="memcached" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160292 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160307 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160325 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160338 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160365 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="mysql-bootstrap" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160377 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="mysql-bootstrap" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160398 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160410 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160433 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160447 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160470 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbdf164-51e7-4faf-986b-fba5044fad2b" containerName="nova-cell1-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160483 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbdf164-51e7-4faf-986b-fba5044fad2b" containerName="nova-cell1-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160505 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945717bc-405f-4628-934c-66e4500f56f0" containerName="nova-scheduler-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160522 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="945717bc-405f-4628-934c-66e4500f56f0" containerName="nova-scheduler-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160545 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-central-agent" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160558 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-central-agent" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160576 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160590 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161032 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161058 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161073 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161090 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161106 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="proxy-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161120 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bbdf164-51e7-4faf-986b-fba5044fad2b" containerName="nova-cell1-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161144 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161158 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-notification-agent" Mar 08 19:58:00 crc 
kubenswrapper[4885]: I0308 19:58:00.161176 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161199 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1d62ba-4033-4906-87c1-d673c1ab8637" containerName="memcached" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161221 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161242 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161262 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="openstack-network-exporter" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161275 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161298 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="rsync" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161321 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161339 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="rabbitmq" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161357 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" 
containerName="object-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161375 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c3ea8e-9683-45b9-805b-d1049840b0da" containerName="kube-state-metrics" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161390 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="swift-recon-cron" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161413 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161435 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-reaper" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161452 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" containerName="nova-cell0-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161471 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161498 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161518 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161541 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161560 4885 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161577 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161593 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="945717bc-405f-4628-934c-66e4500f56f0" containerName="nova-scheduler-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161616 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="sg-core" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161640 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-expirer" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161662 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" containerName="keystone-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161682 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161705 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="ovn-northd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161721 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="galera" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161743 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161765 4885 
memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161784 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161806 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161827 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161847 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161864 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161886 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161903 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="rabbitmq" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161944 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161963 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-central-agent" Mar 08 19:58:00 crc kubenswrapper[4885]: 
I0308 19:58:00.161981 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="cinder-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162001 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162021 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162038 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162057 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162074 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="probe" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162095 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162976 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.168171 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.168452 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.168572 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.175477 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549998-97pgs"] Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.287185 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lggxf\" (UniqueName: \"kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf\") pod \"auto-csr-approver-29549998-97pgs\" (UID: \"0f78a7ad-7933-489d-8395-4bb334007a30\") " pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.388578 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lggxf\" (UniqueName: \"kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf\") pod \"auto-csr-approver-29549998-97pgs\" (UID: \"0f78a7ad-7933-489d-8395-4bb334007a30\") " pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.420235 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lggxf\" (UniqueName: \"kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf\") pod \"auto-csr-approver-29549998-97pgs\" (UID: \"0f78a7ad-7933-489d-8395-4bb334007a30\") " 
pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.485453 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.997517 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549998-97pgs"] Mar 08 19:58:01 crc kubenswrapper[4885]: I0308 19:58:01.006370 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 19:58:01 crc kubenswrapper[4885]: I0308 19:58:01.983315 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549998-97pgs" event={"ID":"0f78a7ad-7933-489d-8395-4bb334007a30","Type":"ContainerStarted","Data":"5e381f1cfe000adb0e79f7dc8276821a3fe78fb5f86c6c5a0ea8fd21de6522c2"} Mar 08 19:58:02 crc kubenswrapper[4885]: I0308 19:58:02.998158 4885 generic.go:334] "Generic (PLEG): container finished" podID="0f78a7ad-7933-489d-8395-4bb334007a30" containerID="f1ee7ab75e6cdb54c44da03961cfd9f0079aa1cd90d1e18350bad8572cfd08fa" exitCode=0 Mar 08 19:58:02 crc kubenswrapper[4885]: I0308 19:58:02.998400 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549998-97pgs" event={"ID":"0f78a7ad-7933-489d-8395-4bb334007a30","Type":"ContainerDied","Data":"f1ee7ab75e6cdb54c44da03961cfd9f0079aa1cd90d1e18350bad8572cfd08fa"} Mar 08 19:58:04 crc kubenswrapper[4885]: I0308 19:58:04.362438 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:04 crc kubenswrapper[4885]: I0308 19:58:04.451795 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lggxf\" (UniqueName: \"kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf\") pod \"0f78a7ad-7933-489d-8395-4bb334007a30\" (UID: \"0f78a7ad-7933-489d-8395-4bb334007a30\") " Mar 08 19:58:04 crc kubenswrapper[4885]: I0308 19:58:04.460972 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf" (OuterVolumeSpecName: "kube-api-access-lggxf") pod "0f78a7ad-7933-489d-8395-4bb334007a30" (UID: "0f78a7ad-7933-489d-8395-4bb334007a30"). InnerVolumeSpecName "kube-api-access-lggxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:58:04 crc kubenswrapper[4885]: I0308 19:58:04.553446 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lggxf\" (UniqueName: \"kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf\") on node \"crc\" DevicePath \"\"" Mar 08 19:58:05 crc kubenswrapper[4885]: I0308 19:58:05.018394 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549998-97pgs" event={"ID":"0f78a7ad-7933-489d-8395-4bb334007a30","Type":"ContainerDied","Data":"5e381f1cfe000adb0e79f7dc8276821a3fe78fb5f86c6c5a0ea8fd21de6522c2"} Mar 08 19:58:05 crc kubenswrapper[4885]: I0308 19:58:05.018789 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e381f1cfe000adb0e79f7dc8276821a3fe78fb5f86c6c5a0ea8fd21de6522c2" Mar 08 19:58:05 crc kubenswrapper[4885]: I0308 19:58:05.018513 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:05 crc kubenswrapper[4885]: I0308 19:58:05.505494 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549992-pzbpd"] Mar 08 19:58:05 crc kubenswrapper[4885]: I0308 19:58:05.528036 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549992-pzbpd"] Mar 08 19:58:07 crc kubenswrapper[4885]: I0308 19:58:07.383422 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" path="/var/lib/kubelet/pods/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec/volumes" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.019552 4885 scope.go:117] "RemoveContainer" containerID="fd8d322616b5f6a1a40bb29dccbdca346cc9726c81d97364f599af639e9e8eb7" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.058466 4885 scope.go:117] "RemoveContainer" containerID="17d561daa3a3a15f18cf22c1e06443b53b3323129a45f06fd40855d4bb9fbf6a" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.088729 4885 scope.go:117] "RemoveContainer" containerID="f13c64b3a8cac3c8bcb02e4b62a77799ac33e44598003cd3a842dd2a34fd0963" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.142033 4885 scope.go:117] "RemoveContainer" containerID="41a09112d08bd7901521db3ad7a70721bb9ad48344056086b5a05f6b55d65d91" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.176712 4885 scope.go:117] "RemoveContainer" containerID="74b67198f27ef75d7d190ca202d1aaac73c4511ab6579ab6cc6cd813c7ff04f7" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.229406 4885 scope.go:117] "RemoveContainer" containerID="d9a0dae6743044b0ee2ed3030e29d6fe34bb42caf427155033310333a42d0a5a" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.259282 4885 scope.go:117] "RemoveContainer" containerID="e06b6952cf7f49bba090c40e1251201f80874fce311561d18f2cd3c7169feb77" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.314141 4885 
scope.go:117] "RemoveContainer" containerID="a1a66f9e3c39e6448e08179a06c354d7f53b5cb971ddf727953fa9e3689c988d" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.344081 4885 scope.go:117] "RemoveContainer" containerID="1f2b4371c693a384eeb8722a9c031b14ca5b16214abf49aab649bdf051aaa6a9" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.389875 4885 scope.go:117] "RemoveContainer" containerID="88f0fd52df3aa60bc754c49bef747bbf48ae9a2eeb839f1af08e43921bc83090" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.442278 4885 scope.go:117] "RemoveContainer" containerID="6c5fd0c87fdc37dc689d9957740eb80226bab6f4a5010aca5f3d0a66ab0c82c3" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.470595 4885 scope.go:117] "RemoveContainer" containerID="6bdf4492dc9ff59a23eeb3289e91f60d8a1697795948d983180a4ac75c5e122e" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.510212 4885 scope.go:117] "RemoveContainer" containerID="302d122e7028362942b84bde8688589c00dd224f41e987890dd32bc866af958e" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.556063 4885 scope.go:117] "RemoveContainer" containerID="d9ea1c70756e397df6785ca6ac5c032d1dcba35d8ce3a74fd9e9a044ec85b1ad" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.584763 4885 scope.go:117] "RemoveContainer" containerID="33bf72d09b758f81e7db370aead1484095f9a13481bdaed0653c1631df7c254b" Mar 08 19:59:58 crc kubenswrapper[4885]: I0308 19:59:58.867469 4885 scope.go:117] "RemoveContainer" containerID="0fd5040bc376c8f684c8ba84911a21e03723dd7d09ccc7b3d5b40d2f11712a3d" Mar 08 19:59:58 crc kubenswrapper[4885]: I0308 19:59:58.903561 4885 scope.go:117] "RemoveContainer" containerID="46919954f7a8695f89f60ffd2c95fd19f9f50cf97e2bbb06931bbceff7c47a47" Mar 08 19:59:58 crc kubenswrapper[4885]: I0308 19:59:58.952113 4885 scope.go:117] "RemoveContainer" containerID="ce47c98e58f66c2a55840d70bb55bfd25b6d54a9fa04407857b7919987c1acd6" Mar 08 19:59:58 crc kubenswrapper[4885]: I0308 19:59:58.981732 4885 scope.go:117] 
"RemoveContainer" containerID="30e722bde831d03eed4bc7a2ac2c7c561a897a0bc5aee76137806f8c867c31e9" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.009532 4885 scope.go:117] "RemoveContainer" containerID="17e37a1234fac68b042cb982b6be421ba7a3bd54c84d93b8bbb1842a9f1fa332" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.026737 4885 scope.go:117] "RemoveContainer" containerID="83c825c6a12d2141eb0dfe1368babc2f8bfb90700bef146c412cb41b76f028b3" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.044286 4885 scope.go:117] "RemoveContainer" containerID="db7de2bfb2402bc7c35eeb0e3a0a80a212c00dd48e7b95c320538d854040bceb" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.065101 4885 scope.go:117] "RemoveContainer" containerID="1b8b8e4856a24e16b23d4c15ef261857dfcc94531017a1b02728028102e1d5ce" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.095899 4885 scope.go:117] "RemoveContainer" containerID="eb80fb2a1922a32d725b4ee5e3cc391924d843e1dfc770a23f4293be00620e5f" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.116185 4885 scope.go:117] "RemoveContainer" containerID="7cf70af4753fbcc177c169967bfd0633e149f5d98df36cb2d6ff676d0a215e21" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.140255 4885 scope.go:117] "RemoveContainer" containerID="9585f2e0b3d9045954e289a5ad0191eb4ab1e2632be8da00e467a511a692dd4f" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.163609 4885 scope.go:117] "RemoveContainer" containerID="4cffd27a9b7724e448f78dd5d8fc02f0f0058f5575262e988c93602105c6d597" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.180144 4885 scope.go:117] "RemoveContainer" containerID="155731b1565c2836cebbf6fadafab50001c261430bf9d84221bbe681fb56634d" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.206307 4885 scope.go:117] "RemoveContainer" containerID="43662ed70d9fce30619b2928a293996c741d8618375e00a25c69cc3ec2f8804c" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.238831 4885 scope.go:117] "RemoveContainer" 
containerID="2ea3e6e51d477fb7795967def43fe0be522063fbf11d9053ce41fa22a8bf42b3" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.173802 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550000-bp85d"] Mar 08 20:00:00 crc kubenswrapper[4885]: E0308 20:00:00.174321 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f78a7ad-7933-489d-8395-4bb334007a30" containerName="oc" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.174342 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f78a7ad-7933-489d-8395-4bb334007a30" containerName="oc" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.174593 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f78a7ad-7933-489d-8395-4bb334007a30" containerName="oc" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.175349 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.179504 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.180370 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.181741 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.182112 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll"] Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.183066 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.187074 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.190307 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.191913 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550000-bp85d"] Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.200162 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll"] Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.315369 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.315442 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.315495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tcst\" (UniqueName: 
\"kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst\") pod \"auto-csr-approver-29550000-bp85d\" (UID: \"d589285c-60a3-4871-9149-7f1f99fc35ee\") " pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.315572 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfdl2\" (UniqueName: \"kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.417035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.417100 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.417148 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tcst\" (UniqueName: \"kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst\") pod \"auto-csr-approver-29550000-bp85d\" (UID: \"d589285c-60a3-4871-9149-7f1f99fc35ee\") " pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 
20:00:00.417231 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfdl2\" (UniqueName: \"kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.419055 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.426554 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.446299 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tcst\" (UniqueName: \"kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst\") pod \"auto-csr-approver-29550000-bp85d\" (UID: \"d589285c-60a3-4871-9149-7f1f99fc35ee\") " pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.447645 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfdl2\" (UniqueName: \"kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.502038 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.519032 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.871373 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll"] Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.956756 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550000-bp85d"] Mar 08 20:00:00 crc kubenswrapper[4885]: W0308 20:00:00.973516 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd589285c_60a3_4871_9149_7f1f99fc35ee.slice/crio-b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85 WatchSource:0}: Error finding container b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85: Status 404 returned error can't find the container with id b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85 Mar 08 20:00:01 crc kubenswrapper[4885]: I0308 20:00:01.180806 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" event={"ID":"0336d864-07a3-41ed-9327-8a39d16d667f","Type":"ContainerStarted","Data":"04ce20a75f575125843cdf885d5d1cfa9b696f27d4253665b2071884d88ab3e4"} Mar 08 20:00:01 crc kubenswrapper[4885]: I0308 20:00:01.181139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" 
event={"ID":"0336d864-07a3-41ed-9327-8a39d16d667f","Type":"ContainerStarted","Data":"ee55d194889b7e5bace9e52f4bf9b0fb7a88fa921d15062bc85ecfda7d18c98e"} Mar 08 20:00:01 crc kubenswrapper[4885]: I0308 20:00:01.183663 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550000-bp85d" event={"ID":"d589285c-60a3-4871-9149-7f1f99fc35ee","Type":"ContainerStarted","Data":"b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85"} Mar 08 20:00:01 crc kubenswrapper[4885]: I0308 20:00:01.203790 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" podStartSLOduration=1.203769232 podStartE2EDuration="1.203769232s" podCreationTimestamp="2026-03-08 20:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:00:01.20215276 +0000 UTC m=+1702.598206793" watchObservedRunningTime="2026-03-08 20:00:01.203769232 +0000 UTC m=+1702.599823265" Mar 08 20:00:02 crc kubenswrapper[4885]: I0308 20:00:02.195755 4885 generic.go:334] "Generic (PLEG): container finished" podID="0336d864-07a3-41ed-9327-8a39d16d667f" containerID="04ce20a75f575125843cdf885d5d1cfa9b696f27d4253665b2071884d88ab3e4" exitCode=0 Mar 08 20:00:02 crc kubenswrapper[4885]: I0308 20:00:02.195801 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" event={"ID":"0336d864-07a3-41ed-9327-8a39d16d667f","Type":"ContainerDied","Data":"04ce20a75f575125843cdf885d5d1cfa9b696f27d4253665b2071884d88ab3e4"} Mar 08 20:00:02 crc kubenswrapper[4885]: I0308 20:00:02.818663 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 08 20:00:02 crc kubenswrapper[4885]: I0308 20:00:02.818788 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.550429 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.691013 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume\") pod \"0336d864-07a3-41ed-9327-8a39d16d667f\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.691081 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfdl2\" (UniqueName: \"kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2\") pod \"0336d864-07a3-41ed-9327-8a39d16d667f\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.691121 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume\") pod \"0336d864-07a3-41ed-9327-8a39d16d667f\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.691668 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"0336d864-07a3-41ed-9327-8a39d16d667f" (UID: "0336d864-07a3-41ed-9327-8a39d16d667f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.696905 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2" (OuterVolumeSpecName: "kube-api-access-sfdl2") pod "0336d864-07a3-41ed-9327-8a39d16d667f" (UID: "0336d864-07a3-41ed-9327-8a39d16d667f"). InnerVolumeSpecName "kube-api-access-sfdl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.699067 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0336d864-07a3-41ed-9327-8a39d16d667f" (UID: "0336d864-07a3-41ed-9327-8a39d16d667f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.792902 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.792983 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfdl2\" (UniqueName: \"kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2\") on node \"crc\" DevicePath \"\"" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.793007 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:00:04 crc kubenswrapper[4885]: I0308 20:00:04.217126 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" event={"ID":"0336d864-07a3-41ed-9327-8a39d16d667f","Type":"ContainerDied","Data":"ee55d194889b7e5bace9e52f4bf9b0fb7a88fa921d15062bc85ecfda7d18c98e"} Mar 08 20:00:04 crc kubenswrapper[4885]: I0308 20:00:04.217483 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee55d194889b7e5bace9e52f4bf9b0fb7a88fa921d15062bc85ecfda7d18c98e" Mar 08 20:00:04 crc kubenswrapper[4885]: I0308 20:00:04.217232 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:24 crc kubenswrapper[4885]: I0308 20:00:24.420000 4885 generic.go:334] "Generic (PLEG): container finished" podID="d589285c-60a3-4871-9149-7f1f99fc35ee" containerID="a1918622a7f691d5b0978579d743cac5d40266346f9de21b0dbb76cf8ca3f823" exitCode=0 Mar 08 20:00:24 crc kubenswrapper[4885]: I0308 20:00:24.420114 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550000-bp85d" event={"ID":"d589285c-60a3-4871-9149-7f1f99fc35ee","Type":"ContainerDied","Data":"a1918622a7f691d5b0978579d743cac5d40266346f9de21b0dbb76cf8ca3f823"} Mar 08 20:00:25 crc kubenswrapper[4885]: I0308 20:00:25.802583 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:25 crc kubenswrapper[4885]: I0308 20:00:25.987669 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tcst\" (UniqueName: \"kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst\") pod \"d589285c-60a3-4871-9149-7f1f99fc35ee\" (UID: \"d589285c-60a3-4871-9149-7f1f99fc35ee\") " Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.001140 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst" (OuterVolumeSpecName: "kube-api-access-8tcst") pod "d589285c-60a3-4871-9149-7f1f99fc35ee" (UID: "d589285c-60a3-4871-9149-7f1f99fc35ee"). InnerVolumeSpecName "kube-api-access-8tcst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.091004 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tcst\" (UniqueName: \"kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst\") on node \"crc\" DevicePath \"\"" Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.443562 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550000-bp85d" event={"ID":"d589285c-60a3-4871-9149-7f1f99fc35ee","Type":"ContainerDied","Data":"b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85"} Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.443622 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85" Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.443697 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.873900 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549994-6zc6p"] Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.880808 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549994-6zc6p"] Mar 08 20:00:27 crc kubenswrapper[4885]: I0308 20:00:27.382445 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e836afb-bb6f-4e67-9df6-5bef0273a523" path="/var/lib/kubelet/pods/5e836afb-bb6f-4e67-9df6-5bef0273a523/volumes" Mar 08 20:00:32 crc kubenswrapper[4885]: I0308 20:00:32.838313 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 08 20:00:32 crc kubenswrapper[4885]: I0308 20:00:32.839024 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:00:59 crc kubenswrapper[4885]: I0308 20:00:59.387294 4885 scope.go:117] "RemoveContainer" containerID="0d00454c184e09bd4a156eebaa35bb3bcacf94bedd622a0c71e0954aef720385" Mar 08 20:00:59 crc kubenswrapper[4885]: I0308 20:00:59.465631 4885 scope.go:117] "RemoveContainer" containerID="72f8ee44be245d4136285cfdfda421e5c74196d06b96d20eec24f989618614f0" Mar 08 20:00:59 crc kubenswrapper[4885]: I0308 20:00:59.530116 4885 scope.go:117] "RemoveContainer" containerID="6794573adf705b25644869fa29cf7b121bfba4201d8cc4c2b8d4234d9e883a1d" Mar 08 20:00:59 crc kubenswrapper[4885]: I0308 20:00:59.588877 4885 scope.go:117] "RemoveContainer" containerID="5bc08b9d58402236103943567dfa278b8697894bc6f9fe1ef5bb281393c8f6d5" Mar 08 20:00:59 crc kubenswrapper[4885]: I0308 20:00:59.649436 4885 scope.go:117] "RemoveContainer" containerID="9348c59006c229d5addb1edf8ed7baf6b6d89cc79e7937bbe98f9278bc9d36c3" Mar 08 20:01:02 crc kubenswrapper[4885]: I0308 20:01:02.818509 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:01:02 crc kubenswrapper[4885]: I0308 20:01:02.818896 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:01:02 crc kubenswrapper[4885]: I0308 20:01:02.818984 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:01:02 crc kubenswrapper[4885]: I0308 20:01:02.819871 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:01:02 crc kubenswrapper[4885]: I0308 20:01:02.819999 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" gracePeriod=600 Mar 08 20:01:02 crc kubenswrapper[4885]: E0308 20:01:02.948358 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:03 crc kubenswrapper[4885]: I0308 20:01:03.821661 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" exitCode=0 Mar 08 20:01:03 crc kubenswrapper[4885]: I0308 20:01:03.821741 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400"} Mar 08 20:01:03 crc kubenswrapper[4885]: I0308 20:01:03.822156 4885 scope.go:117] "RemoveContainer" containerID="b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8" Mar 08 20:01:03 crc kubenswrapper[4885]: I0308 20:01:03.822702 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:01:03 crc kubenswrapper[4885]: E0308 20:01:03.822991 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:16 crc kubenswrapper[4885]: I0308 20:01:16.368414 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:01:16 crc kubenswrapper[4885]: E0308 20:01:16.369090 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:30 crc kubenswrapper[4885]: I0308 20:01:30.368853 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:01:30 crc kubenswrapper[4885]: E0308 20:01:30.369845 4885 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:43 crc kubenswrapper[4885]: I0308 20:01:43.368581 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:01:43 crc kubenswrapper[4885]: E0308 20:01:43.369686 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:56 crc kubenswrapper[4885]: I0308 20:01:56.369278 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:01:56 crc kubenswrapper[4885]: E0308 20:01:56.370220 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:59 crc kubenswrapper[4885]: I0308 20:01:59.784421 4885 scope.go:117] "RemoveContainer" containerID="52341d43266cc07b81a420fcef2575f373be02c72919fe4d7ea1b1bbdd8174c2" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 
20:02:00.162826 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550002-jw6jr"] Mar 08 20:02:00 crc kubenswrapper[4885]: E0308 20:02:00.163782 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0336d864-07a3-41ed-9327-8a39d16d667f" containerName="collect-profiles" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.163815 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0336d864-07a3-41ed-9327-8a39d16d667f" containerName="collect-profiles" Mar 08 20:02:00 crc kubenswrapper[4885]: E0308 20:02:00.163845 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d589285c-60a3-4871-9149-7f1f99fc35ee" containerName="oc" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.163858 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d589285c-60a3-4871-9149-7f1f99fc35ee" containerName="oc" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.164263 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0336d864-07a3-41ed-9327-8a39d16d667f" containerName="collect-profiles" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.164318 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d589285c-60a3-4871-9149-7f1f99fc35ee" containerName="oc" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.165281 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.168348 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.170641 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.174538 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.177200 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550002-jw6jr"] Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.199627 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc7pl\" (UniqueName: \"kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl\") pod \"auto-csr-approver-29550002-jw6jr\" (UID: \"cdb49670-90cc-4dfa-9e28-1ae44ed1104b\") " pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.301121 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc7pl\" (UniqueName: \"kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl\") pod \"auto-csr-approver-29550002-jw6jr\" (UID: \"cdb49670-90cc-4dfa-9e28-1ae44ed1104b\") " pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.323964 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc7pl\" (UniqueName: \"kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl\") pod \"auto-csr-approver-29550002-jw6jr\" (UID: \"cdb49670-90cc-4dfa-9e28-1ae44ed1104b\") " 
pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.498617 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.794457 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550002-jw6jr"] Mar 08 20:02:00 crc kubenswrapper[4885]: W0308 20:02:00.804779 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdb49670_90cc_4dfa_9e28_1ae44ed1104b.slice/crio-a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd WatchSource:0}: Error finding container a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd: Status 404 returned error can't find the container with id a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd Mar 08 20:02:01 crc kubenswrapper[4885]: I0308 20:02:01.339097 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" event={"ID":"cdb49670-90cc-4dfa-9e28-1ae44ed1104b","Type":"ContainerStarted","Data":"a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd"} Mar 08 20:02:02 crc kubenswrapper[4885]: I0308 20:02:02.350377 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" event={"ID":"cdb49670-90cc-4dfa-9e28-1ae44ed1104b","Type":"ContainerStarted","Data":"85234bd429cc704e5f096400e4f1b62d94a0128f9b3c1c36899a84d4f6ac1ba9"} Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.383753 4885 generic.go:334] "Generic (PLEG): container finished" podID="cdb49670-90cc-4dfa-9e28-1ae44ed1104b" containerID="85234bd429cc704e5f096400e4f1b62d94a0128f9b3c1c36899a84d4f6ac1ba9" exitCode=0 Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.392088 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29550002-jw6jr" event={"ID":"cdb49670-90cc-4dfa-9e28-1ae44ed1104b","Type":"ContainerDied","Data":"85234bd429cc704e5f096400e4f1b62d94a0128f9b3c1c36899a84d4f6ac1ba9"} Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.731207 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.756282 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc7pl\" (UniqueName: \"kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl\") pod \"cdb49670-90cc-4dfa-9e28-1ae44ed1104b\" (UID: \"cdb49670-90cc-4dfa-9e28-1ae44ed1104b\") " Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.777273 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl" (OuterVolumeSpecName: "kube-api-access-zc7pl") pod "cdb49670-90cc-4dfa-9e28-1ae44ed1104b" (UID: "cdb49670-90cc-4dfa-9e28-1ae44ed1104b"). InnerVolumeSpecName "kube-api-access-zc7pl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.857700 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc7pl\" (UniqueName: \"kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl\") on node \"crc\" DevicePath \"\"" Mar 08 20:02:04 crc kubenswrapper[4885]: I0308 20:02:04.397944 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" event={"ID":"cdb49670-90cc-4dfa-9e28-1ae44ed1104b","Type":"ContainerDied","Data":"a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd"} Mar 08 20:02:04 crc kubenswrapper[4885]: I0308 20:02:04.398007 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd" Mar 08 20:02:04 crc kubenswrapper[4885]: I0308 20:02:04.398028 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:04 crc kubenswrapper[4885]: I0308 20:02:04.809753 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549996-rr28z"] Mar 08 20:02:04 crc kubenswrapper[4885]: I0308 20:02:04.820333 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549996-rr28z"] Mar 08 20:02:05 crc kubenswrapper[4885]: I0308 20:02:05.383254 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9eec97-1b10-4d6a-9787-e137d3c37dec" path="/var/lib/kubelet/pods/5d9eec97-1b10-4d6a-9787-e137d3c37dec/volumes" Mar 08 20:02:10 crc kubenswrapper[4885]: I0308 20:02:10.368504 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:02:10 crc kubenswrapper[4885]: E0308 20:02:10.369482 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:02:22 crc kubenswrapper[4885]: I0308 20:02:22.368374 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:02:22 crc kubenswrapper[4885]: E0308 20:02:22.369742 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:02:33 crc kubenswrapper[4885]: I0308 20:02:33.368813 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:02:33 crc kubenswrapper[4885]: E0308 20:02:33.369778 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:02:47 crc kubenswrapper[4885]: I0308 20:02:47.368550 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:02:47 crc kubenswrapper[4885]: E0308 20:02:47.369505 4885 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:02:59 crc kubenswrapper[4885]: I0308 20:02:59.879525 4885 scope.go:117] "RemoveContainer" containerID="5576d8a075a0f44218f0cea569a1265805ed7bccbb93726d2adc83621dd67e49" Mar 08 20:03:01 crc kubenswrapper[4885]: I0308 20:03:01.369043 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:03:01 crc kubenswrapper[4885]: E0308 20:03:01.369698 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:03:13 crc kubenswrapper[4885]: I0308 20:03:13.368665 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:03:13 crc kubenswrapper[4885]: E0308 20:03:13.369791 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:03:21 crc kubenswrapper[4885]: I0308 20:03:21.310161 4885 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" podUID="157555d5-ca64-49f8-8849-cd763c83feda" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.82:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 20:03:28 crc kubenswrapper[4885]: I0308 20:03:28.369189 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:03:28 crc kubenswrapper[4885]: E0308 20:03:28.370145 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:03:39 crc kubenswrapper[4885]: I0308 20:03:39.375672 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:03:39 crc kubenswrapper[4885]: E0308 20:03:39.376721 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.062146 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:03:47 crc kubenswrapper[4885]: E0308 20:03:47.063176 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cdb49670-90cc-4dfa-9e28-1ae44ed1104b" containerName="oc" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.063198 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb49670-90cc-4dfa-9e28-1ae44ed1104b" containerName="oc" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.063471 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb49670-90cc-4dfa-9e28-1ae44ed1104b" containerName="oc" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.065220 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.078250 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.123971 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prh5k\" (UniqueName: \"kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.124038 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.124112 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " 
pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.225244 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.225365 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.225417 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prh5k\" (UniqueName: \"kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.226119 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.226126 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" 
Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.250723 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.256990 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prh5k\" (UniqueName: \"kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.259208 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.268002 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.327208 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrbzc\" (UniqueName: \"kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.327249 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.327294 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.397889 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.428708 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrbzc\" (UniqueName: \"kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.429073 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.429099 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.429472 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 
20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.429495 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.445518 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrbzc\" (UniqueName: \"kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.613609 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.644984 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.932045 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:03:47 crc kubenswrapper[4885]: E0308 20:03:47.997983 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7a5e492_4ddd_4229_b1fb_019fa71d2951.slice/crio-2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7a5e492_4ddd_4229_b1fb_019fa71d2951.slice/crio-conmon-2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2.scope\": RecentStats: unable to find data in memory cache]" Mar 08 20:03:48 crc 
kubenswrapper[4885]: I0308 20:03:48.668661 4885 generic.go:334] "Generic (PLEG): container finished" podID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerID="2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2" exitCode=0 Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.668762 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerDied","Data":"2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2"} Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.668981 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerStarted","Data":"7562d15c2fd135e779c5aa72239de4c7f9c7869128d1cb264d9a14a668809cf4"} Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.671192 4885 generic.go:334] "Generic (PLEG): container finished" podID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerID="22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326" exitCode=0 Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.671235 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerDied","Data":"22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326"} Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.671261 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerStarted","Data":"d49c4dba2cb2e10975f26f51e87613eadff9b443e9c9c89ec4745a20041c01b4"} Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.671704 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 
20:03:49.468600 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.469907 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.475881 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.560222 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.560271 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7n4n\" (UniqueName: \"kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.560337 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.641990 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.643270 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.661799 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.661863 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.661889 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7n4n\" (UniqueName: \"kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.662389 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.662458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " 
pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.665913 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.690940 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerStarted","Data":"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c"} Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.696071 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7n4n\" (UniqueName: \"kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.701535 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerStarted","Data":"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941"} Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.763465 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbpfb\" (UniqueName: \"kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.764029 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content\") pod 
\"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.764272 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.790600 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.865287 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbpfb\" (UniqueName: \"kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.865334 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.865378 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.865803 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.866269 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.897956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbpfb\" (UniqueName: \"kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.957761 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.230706 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:03:50 crc kubenswrapper[4885]: W0308 20:03:50.478379 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17fe0b53_ae94_4b6b_8a6d_1d1f68a52bf7.slice/crio-6a5c70ea2ed0ef38c370fb83471173899ee581eabbc19e5b33bd1521ba7a60d0 WatchSource:0}: Error finding container 6a5c70ea2ed0ef38c370fb83471173899ee581eabbc19e5b33bd1521ba7a60d0: Status 404 returned error can't find the container with id 6a5c70ea2ed0ef38c370fb83471173899ee581eabbc19e5b33bd1521ba7a60d0 Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.480077 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.710322 4885 generic.go:334] "Generic (PLEG): container finished" podID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerID="99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c" exitCode=0 Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.710518 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerDied","Data":"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c"} Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.713438 4885 generic.go:334] "Generic (PLEG): container finished" podID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerID="df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853" exitCode=0 Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.713505 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" 
event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerDied","Data":"df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853"} Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.713534 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerStarted","Data":"1d0f0c8555c4b252c5081e0c34f2d3afd49d81f6583bcf2c95de0ec400208b12"} Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.717692 4885 generic.go:334] "Generic (PLEG): container finished" podID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerID="9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941" exitCode=0 Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.717756 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerDied","Data":"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941"} Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.722892 4885 generic.go:334] "Generic (PLEG): container finished" podID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerID="e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6" exitCode=0 Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.722938 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerDied","Data":"e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6"} Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.722958 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerStarted","Data":"6a5c70ea2ed0ef38c370fb83471173899ee581eabbc19e5b33bd1521ba7a60d0"} Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 
20:03:51.368660 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:03:51 crc kubenswrapper[4885]: E0308 20:03:51.369569 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.745273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerStarted","Data":"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7"} Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.747910 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerStarted","Data":"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60"} Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.751973 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerStarted","Data":"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a"} Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.754283 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerStarted","Data":"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164"} Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.777055 4885 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cqvqv" podStartSLOduration=2.116615552 podStartE2EDuration="4.777029414s" podCreationTimestamp="2026-03-08 20:03:47 +0000 UTC" firstStartedPulling="2026-03-08 20:03:48.673347997 +0000 UTC m=+1930.069402060" lastFinishedPulling="2026-03-08 20:03:51.333761899 +0000 UTC m=+1932.729815922" observedRunningTime="2026-03-08 20:03:51.76522529 +0000 UTC m=+1933.161279323" watchObservedRunningTime="2026-03-08 20:03:51.777029414 +0000 UTC m=+1933.173083437" Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.835963 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4ptr9" podStartSLOduration=2.322101759 podStartE2EDuration="4.835922622s" podCreationTimestamp="2026-03-08 20:03:47 +0000 UTC" firstStartedPulling="2026-03-08 20:03:48.67125566 +0000 UTC m=+1930.067309723" lastFinishedPulling="2026-03-08 20:03:51.185076563 +0000 UTC m=+1932.581130586" observedRunningTime="2026-03-08 20:03:51.820258434 +0000 UTC m=+1933.216312457" watchObservedRunningTime="2026-03-08 20:03:51.835922622 +0000 UTC m=+1933.231976655" Mar 08 20:03:52 crc kubenswrapper[4885]: I0308 20:03:52.768883 4885 generic.go:334] "Generic (PLEG): container finished" podID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerID="6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164" exitCode=0 Mar 08 20:03:52 crc kubenswrapper[4885]: I0308 20:03:52.768952 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerDied","Data":"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164"} Mar 08 20:03:52 crc kubenswrapper[4885]: I0308 20:03:52.770562 4885 generic.go:334] "Generic (PLEG): container finished" podID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" 
containerID="68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60" exitCode=0 Mar 08 20:03:52 crc kubenswrapper[4885]: I0308 20:03:52.770825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerDied","Data":"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60"} Mar 08 20:03:53 crc kubenswrapper[4885]: I0308 20:03:53.782481 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerStarted","Data":"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780"} Mar 08 20:03:53 crc kubenswrapper[4885]: I0308 20:03:53.785409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerStarted","Data":"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b"} Mar 08 20:03:53 crc kubenswrapper[4885]: I0308 20:03:53.805818 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-84srk" podStartSLOduration=2.35787055 podStartE2EDuration="4.805797249s" podCreationTimestamp="2026-03-08 20:03:49 +0000 UTC" firstStartedPulling="2026-03-08 20:03:50.725124183 +0000 UTC m=+1932.121178216" lastFinishedPulling="2026-03-08 20:03:53.173050852 +0000 UTC m=+1934.569104915" observedRunningTime="2026-03-08 20:03:53.803245151 +0000 UTC m=+1935.199299194" watchObservedRunningTime="2026-03-08 20:03:53.805797249 +0000 UTC m=+1935.201851292" Mar 08 20:03:53 crc kubenswrapper[4885]: I0308 20:03:53.837847 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dn4kh" podStartSLOduration=2.343723174 podStartE2EDuration="4.837829641s" podCreationTimestamp="2026-03-08 20:03:49 +0000 UTC" 
firstStartedPulling="2026-03-08 20:03:50.714695186 +0000 UTC m=+1932.110749219" lastFinishedPulling="2026-03-08 20:03:53.208801663 +0000 UTC m=+1934.604855686" observedRunningTime="2026-03-08 20:03:53.830498636 +0000 UTC m=+1935.226552689" watchObservedRunningTime="2026-03-08 20:03:53.837829641 +0000 UTC m=+1935.233883674" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.398548 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.398946 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.477436 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.614599 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.614696 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.688784 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.879190 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.891592 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.791474 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.791827 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.855719 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.856143 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4ptr9" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="registry-server" containerID="cri-o://2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a" gracePeriod=2 Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.959269 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.959347 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.043612 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.044119 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cqvqv" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="registry-server" containerID="cri-o://e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7" gracePeriod=2 Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.047122 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.142092 4885 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29550004-sf7cv"] Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.143356 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.145066 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.147260 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.155875 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.158021 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550004-sf7cv"] Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.231183 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8gzs\" (UniqueName: \"kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs\") pod \"auto-csr-approver-29550004-sf7cv\" (UID: \"b24ca97d-74ff-4a8d-9621-40d03e8be6cc\") " pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.327617 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.333901 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8gzs\" (UniqueName: \"kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs\") pod \"auto-csr-approver-29550004-sf7cv\" (UID: \"b24ca97d-74ff-4a8d-9621-40d03e8be6cc\") " pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.407347 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8gzs\" (UniqueName: \"kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs\") pod \"auto-csr-approver-29550004-sf7cv\" (UID: \"b24ca97d-74ff-4a8d-9621-40d03e8be6cc\") " pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.435780 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content\") pod \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.435956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prh5k\" (UniqueName: \"kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k\") pod \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.436010 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities\") pod \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " Mar 08 20:04:00 crc 
kubenswrapper[4885]: I0308 20:04:00.437325 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities" (OuterVolumeSpecName: "utilities") pod "f7a5e492-4ddd-4229-b1fb-019fa71d2951" (UID: "f7a5e492-4ddd-4229-b1fb-019fa71d2951"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.441195 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k" (OuterVolumeSpecName: "kube-api-access-prh5k") pod "f7a5e492-4ddd-4229-b1fb-019fa71d2951" (UID: "f7a5e492-4ddd-4229-b1fb-019fa71d2951"). InnerVolumeSpecName "kube-api-access-prh5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.484206 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.537529 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prh5k\" (UniqueName: \"kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.537558 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.842706 4885 generic.go:334] "Generic (PLEG): container finished" podID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerID="2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a" exitCode=0 Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.843634 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.844059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerDied","Data":"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a"} Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.844088 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerDied","Data":"7562d15c2fd135e779c5aa72239de4c7f9c7869128d1cb264d9a14a668809cf4"} Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.844110 4885 scope.go:117] "RemoveContainer" containerID="2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.859727 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dn4kh" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="registry-server" probeResult="failure" output=< Mar 08 20:04:00 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 20:04:00 crc kubenswrapper[4885]: > Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.875497 4885 scope.go:117] "RemoveContainer" containerID="99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.889892 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550004-sf7cv"] Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.902375 4885 scope.go:117] "RemoveContainer" containerID="2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2" Mar 08 20:04:00 crc kubenswrapper[4885]: W0308 20:04:00.909412 4885 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb24ca97d_74ff_4a8d_9621_40d03e8be6cc.slice/crio-0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297 WatchSource:0}: Error finding container 0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297: Status 404 returned error can't find the container with id 0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297 Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.910889 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.924299 4885 scope.go:117] "RemoveContainer" containerID="2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a" Mar 08 20:04:00 crc kubenswrapper[4885]: E0308 20:04:00.925325 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a\": container with ID starting with 2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a not found: ID does not exist" containerID="2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.925398 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a"} err="failed to get container status \"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a\": rpc error: code = NotFound desc = could not find container \"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a\": container with ID starting with 2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a not found: ID does not exist" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.925446 4885 scope.go:117] "RemoveContainer" 
containerID="99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c" Mar 08 20:04:00 crc kubenswrapper[4885]: E0308 20:04:00.925907 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c\": container with ID starting with 99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c not found: ID does not exist" containerID="99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.925969 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c"} err="failed to get container status \"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c\": rpc error: code = NotFound desc = could not find container \"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c\": container with ID starting with 99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c not found: ID does not exist" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.926005 4885 scope.go:117] "RemoveContainer" containerID="2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2" Mar 08 20:04:00 crc kubenswrapper[4885]: E0308 20:04:00.926315 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2\": container with ID starting with 2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2 not found: ID does not exist" containerID="2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.926344 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2"} err="failed to get container status \"2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2\": rpc error: code = NotFound desc = could not find container \"2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2\": container with ID starting with 2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2 not found: ID does not exist" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.036236 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7a5e492-4ddd-4229-b1fb-019fa71d2951" (UID: "f7a5e492-4ddd-4229-b1fb-019fa71d2951"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.044821 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.224818 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.231434 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.393755 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" path="/var/lib/kubelet/pods/f7a5e492-4ddd-4229-b1fb-019fa71d2951/volumes" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.482440 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.556000 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content\") pod \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.556089 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities\") pod \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.556184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrbzc\" (UniqueName: \"kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc\") pod \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.557347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities" (OuterVolumeSpecName: "utilities") pod "2ffd5483-5bf1-4ca1-945d-5de49426ee21" (UID: "2ffd5483-5bf1-4ca1-945d-5de49426ee21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.561302 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc" (OuterVolumeSpecName: "kube-api-access-zrbzc") pod "2ffd5483-5bf1-4ca1-945d-5de49426ee21" (UID: "2ffd5483-5bf1-4ca1-945d-5de49426ee21"). InnerVolumeSpecName "kube-api-access-zrbzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.613679 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ffd5483-5bf1-4ca1-945d-5de49426ee21" (UID: "2ffd5483-5bf1-4ca1-945d-5de49426ee21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.658225 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrbzc\" (UniqueName: \"kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.658280 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.658290 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.867964 4885 generic.go:334] "Generic (PLEG): container finished" podID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerID="e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7" exitCode=0 Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.868013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerDied","Data":"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7"} Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.868076 4885 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.868100 4885 scope.go:117] "RemoveContainer" containerID="e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.868079 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerDied","Data":"d49c4dba2cb2e10975f26f51e87613eadff9b443e9c9c89ec4745a20041c01b4"} Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.870433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" event={"ID":"b24ca97d-74ff-4a8d-9621-40d03e8be6cc","Type":"ContainerStarted","Data":"0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297"} Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.904043 4885 scope.go:117] "RemoveContainer" containerID="9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.919739 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.933383 4885 scope.go:117] "RemoveContainer" containerID="22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.933474 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.959488 4885 scope.go:117] "RemoveContainer" containerID="e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7" Mar 08 20:04:01 crc kubenswrapper[4885]: E0308 20:04:01.959898 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7\": container with ID starting with e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7 not found: ID does not exist" containerID="e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.959954 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7"} err="failed to get container status \"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7\": rpc error: code = NotFound desc = could not find container \"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7\": container with ID starting with e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7 not found: ID does not exist" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.959979 4885 scope.go:117] "RemoveContainer" containerID="9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941" Mar 08 20:04:01 crc kubenswrapper[4885]: E0308 20:04:01.960274 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941\": container with ID starting with 9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941 not found: ID does not exist" containerID="9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.960304 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941"} err="failed to get container status \"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941\": rpc error: code = NotFound desc = could not find container \"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941\": container with ID 
starting with 9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941 not found: ID does not exist" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.960342 4885 scope.go:117] "RemoveContainer" containerID="22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326" Mar 08 20:04:01 crc kubenswrapper[4885]: E0308 20:04:01.960597 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326\": container with ID starting with 22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326 not found: ID does not exist" containerID="22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.960624 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326"} err="failed to get container status \"22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326\": rpc error: code = NotFound desc = could not find container \"22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326\": container with ID starting with 22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326 not found: ID does not exist" Mar 08 20:04:02 crc kubenswrapper[4885]: I0308 20:04:02.886455 4885 generic.go:334] "Generic (PLEG): container finished" podID="b24ca97d-74ff-4a8d-9621-40d03e8be6cc" containerID="59156303cc1056f8c42d0e37cf490933d64202838fc19ab28796fa13df721c66" exitCode=0 Mar 08 20:04:02 crc kubenswrapper[4885]: I0308 20:04:02.886512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" event={"ID":"b24ca97d-74ff-4a8d-9621-40d03e8be6cc","Type":"ContainerDied","Data":"59156303cc1056f8c42d0e37cf490933d64202838fc19ab28796fa13df721c66"} Mar 08 20:04:03 crc kubenswrapper[4885]: I0308 20:04:03.399502 4885 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" path="/var/lib/kubelet/pods/2ffd5483-5bf1-4ca1-945d-5de49426ee21/volumes" Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.299157 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.409537 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8gzs\" (UniqueName: \"kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs\") pod \"b24ca97d-74ff-4a8d-9621-40d03e8be6cc\" (UID: \"b24ca97d-74ff-4a8d-9621-40d03e8be6cc\") " Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.418230 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs" (OuterVolumeSpecName: "kube-api-access-p8gzs") pod "b24ca97d-74ff-4a8d-9621-40d03e8be6cc" (UID: "b24ca97d-74ff-4a8d-9621-40d03e8be6cc"). InnerVolumeSpecName "kube-api-access-p8gzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.511104 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8gzs\" (UniqueName: \"kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.913234 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" event={"ID":"b24ca97d-74ff-4a8d-9621-40d03e8be6cc","Type":"ContainerDied","Data":"0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297"} Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.913283 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297" Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.913369 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.048548 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.048974 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-84srk" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="registry-server" containerID="cri-o://76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780" gracePeriod=2 Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.369834 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:04:05 crc kubenswrapper[4885]: E0308 20:04:05.370228 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.399007 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549998-97pgs"] Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.403709 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549998-97pgs"] Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.619697 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.731073 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content\") pod \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.731177 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities\") pod \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.731266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbpfb\" (UniqueName: \"kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb\") pod \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.734984 4885 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities" (OuterVolumeSpecName: "utilities") pod "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" (UID: "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.739088 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb" (OuterVolumeSpecName: "kube-api-access-mbpfb") pod "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" (UID: "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7"). InnerVolumeSpecName "kube-api-access-mbpfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.800443 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" (UID: "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.832341 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.832371 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.832382 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbpfb\" (UniqueName: \"kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.924230 4885 generic.go:334] "Generic (PLEG): container finished" podID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerID="76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780" exitCode=0 Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.924276 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.924309 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerDied","Data":"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780"} Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.924397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerDied","Data":"6a5c70ea2ed0ef38c370fb83471173899ee581eabbc19e5b33bd1521ba7a60d0"} Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.924433 4885 scope.go:117] "RemoveContainer" containerID="76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.951393 4885 scope.go:117] "RemoveContainer" containerID="68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.973668 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.986370 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.988729 4885 scope.go:117] "RemoveContainer" containerID="e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.033156 4885 scope.go:117] "RemoveContainer" containerID="76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780" Mar 08 20:04:06 crc kubenswrapper[4885]: E0308 20:04:06.033612 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780\": container with ID starting with 76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780 not found: ID does not exist" containerID="76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.033640 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780"} err="failed to get container status \"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780\": rpc error: code = NotFound desc = could not find container \"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780\": container with ID starting with 76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780 not found: ID does not exist" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.033658 4885 scope.go:117] "RemoveContainer" containerID="68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60" Mar 08 20:04:06 crc kubenswrapper[4885]: E0308 20:04:06.033981 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60\": container with ID starting with 68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60 not found: ID does not exist" containerID="68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.033998 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60"} err="failed to get container status \"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60\": rpc error: code = NotFound desc = could not find container \"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60\": container with ID 
starting with 68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60 not found: ID does not exist" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.034013 4885 scope.go:117] "RemoveContainer" containerID="e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6" Mar 08 20:04:06 crc kubenswrapper[4885]: E0308 20:04:06.034258 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6\": container with ID starting with e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6 not found: ID does not exist" containerID="e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.034276 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6"} err="failed to get container status \"e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6\": rpc error: code = NotFound desc = could not find container \"e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6\": container with ID starting with e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6 not found: ID does not exist" Mar 08 20:04:07 crc kubenswrapper[4885]: I0308 20:04:07.384951 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f78a7ad-7933-489d-8395-4bb334007a30" path="/var/lib/kubelet/pods/0f78a7ad-7933-489d-8395-4bb334007a30/volumes" Mar 08 20:04:07 crc kubenswrapper[4885]: I0308 20:04:07.386094 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" path="/var/lib/kubelet/pods/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7/volumes" Mar 08 20:04:09 crc kubenswrapper[4885]: I0308 20:04:09.872603 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:04:09 crc kubenswrapper[4885]: I0308 20:04:09.948743 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:04:10 crc kubenswrapper[4885]: I0308 20:04:10.125346 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:04:10 crc kubenswrapper[4885]: I0308 20:04:10.991272 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dn4kh" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="registry-server" containerID="cri-o://c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b" gracePeriod=2 Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.435691 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.528721 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities\") pod \"14843e03-07ac-482d-b2f6-6bbfb7567b91\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.528833 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content\") pod \"14843e03-07ac-482d-b2f6-6bbfb7567b91\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.528955 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7n4n\" (UniqueName: \"kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n\") pod \"14843e03-07ac-482d-b2f6-6bbfb7567b91\" (UID: 
\"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.531036 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities" (OuterVolumeSpecName: "utilities") pod "14843e03-07ac-482d-b2f6-6bbfb7567b91" (UID: "14843e03-07ac-482d-b2f6-6bbfb7567b91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.541281 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n" (OuterVolumeSpecName: "kube-api-access-j7n4n") pod "14843e03-07ac-482d-b2f6-6bbfb7567b91" (UID: "14843e03-07ac-482d-b2f6-6bbfb7567b91"). InnerVolumeSpecName "kube-api-access-j7n4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.629945 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7n4n\" (UniqueName: \"kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.629976 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.732910 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14843e03-07ac-482d-b2f6-6bbfb7567b91" (UID: "14843e03-07ac-482d-b2f6-6bbfb7567b91"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.832766 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.019366 4885 generic.go:334] "Generic (PLEG): container finished" podID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerID="c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b" exitCode=0 Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.019465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerDied","Data":"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b"} Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.019553 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.019607 4885 scope.go:117] "RemoveContainer" containerID="c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.019581 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerDied","Data":"1d0f0c8555c4b252c5081e0c34f2d3afd49d81f6583bcf2c95de0ec400208b12"} Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.066449 4885 scope.go:117] "RemoveContainer" containerID="6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.069773 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.076596 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.102574 4885 scope.go:117] "RemoveContainer" containerID="df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.134320 4885 scope.go:117] "RemoveContainer" containerID="c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b" Mar 08 20:04:12 crc kubenswrapper[4885]: E0308 20:04:12.134943 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b\": container with ID starting with c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b not found: ID does not exist" containerID="c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.134984 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b"} err="failed to get container status \"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b\": rpc error: code = NotFound desc = could not find container \"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b\": container with ID starting with c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b not found: ID does not exist" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.135013 4885 scope.go:117] "RemoveContainer" containerID="6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164" Mar 08 20:04:12 crc kubenswrapper[4885]: E0308 20:04:12.135627 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164\": container with ID starting with 6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164 not found: ID does not exist" containerID="6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.135811 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164"} err="failed to get container status \"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164\": rpc error: code = NotFound desc = could not find container \"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164\": container with ID starting with 6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164 not found: ID does not exist" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.136014 4885 scope.go:117] "RemoveContainer" containerID="df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853" Mar 08 20:04:12 crc kubenswrapper[4885]: E0308 
20:04:12.136573 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853\": container with ID starting with df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853 not found: ID does not exist" containerID="df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.136645 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853"} err="failed to get container status \"df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853\": rpc error: code = NotFound desc = could not find container \"df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853\": container with ID starting with df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853 not found: ID does not exist" Mar 08 20:04:13 crc kubenswrapper[4885]: I0308 20:04:13.385564 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" path="/var/lib/kubelet/pods/14843e03-07ac-482d-b2f6-6bbfb7567b91/volumes" Mar 08 20:04:16 crc kubenswrapper[4885]: I0308 20:04:16.367649 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:04:16 crc kubenswrapper[4885]: E0308 20:04:16.368148 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:04:27 crc kubenswrapper[4885]: I0308 20:04:27.369909 
4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:04:27 crc kubenswrapper[4885]: E0308 20:04:27.370854 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:04:39 crc kubenswrapper[4885]: I0308 20:04:39.376739 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:04:39 crc kubenswrapper[4885]: E0308 20:04:39.377974 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:04:53 crc kubenswrapper[4885]: I0308 20:04:53.368780 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:04:53 crc kubenswrapper[4885]: E0308 20:04:53.370087 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:05:00 crc kubenswrapper[4885]: I0308 
20:05:00.014419 4885 scope.go:117] "RemoveContainer" containerID="f1ee7ab75e6cdb54c44da03961cfd9f0079aa1cd90d1e18350bad8572cfd08fa" Mar 08 20:05:08 crc kubenswrapper[4885]: I0308 20:05:08.368991 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:05:08 crc kubenswrapper[4885]: E0308 20:05:08.370116 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:05:20 crc kubenswrapper[4885]: I0308 20:05:20.369020 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:05:20 crc kubenswrapper[4885]: E0308 20:05:20.370103 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:05:34 crc kubenswrapper[4885]: I0308 20:05:34.368580 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:05:34 crc kubenswrapper[4885]: E0308 20:05:34.369585 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:05:45 crc kubenswrapper[4885]: I0308 20:05:45.368075 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:05:45 crc kubenswrapper[4885]: E0308 20:05:45.369102 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:05:56 crc kubenswrapper[4885]: I0308 20:05:56.368881 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:05:56 crc kubenswrapper[4885]: E0308 20:05:56.370119 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.158967 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550006-6vnqt"] Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159576 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 
20:06:00.159603 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159629 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24ca97d-74ff-4a8d-9621-40d03e8be6cc" containerName="oc" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159639 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24ca97d-74ff-4a8d-9621-40d03e8be6cc" containerName="oc" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159650 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159657 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159676 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159687 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159701 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159711 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159727 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159739 4885 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159757 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159766 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159779 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159789 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159803 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159813 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159832 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159844 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159866 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159875 4885 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159887 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159899 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159913 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159948 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.160221 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.160251 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.160266 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.160285 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24ca97d-74ff-4a8d-9621-40d03e8be6cc" containerName="oc" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.160303 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.161156 4885 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.167808 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.167826 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.167861 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.175232 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550006-6vnqt"] Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.244584 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t4dc\" (UniqueName: \"kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc\") pod \"auto-csr-approver-29550006-6vnqt\" (UID: \"f2563b40-3861-46cb-b313-1c221b526aa7\") " pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.346186 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t4dc\" (UniqueName: \"kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc\") pod \"auto-csr-approver-29550006-6vnqt\" (UID: \"f2563b40-3861-46cb-b313-1c221b526aa7\") " pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.367684 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t4dc\" (UniqueName: \"kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc\") pod \"auto-csr-approver-29550006-6vnqt\" (UID: 
\"f2563b40-3861-46cb-b313-1c221b526aa7\") " pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.488614 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:01 crc kubenswrapper[4885]: I0308 20:06:01.025055 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550006-6vnqt"] Mar 08 20:06:01 crc kubenswrapper[4885]: I0308 20:06:01.166838 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" event={"ID":"f2563b40-3861-46cb-b313-1c221b526aa7","Type":"ContainerStarted","Data":"8338dd38101eb0f2437313f523dea1c5dee54a4c1716e9638cf7059653c9580d"} Mar 08 20:06:03 crc kubenswrapper[4885]: I0308 20:06:03.187549 4885 generic.go:334] "Generic (PLEG): container finished" podID="f2563b40-3861-46cb-b313-1c221b526aa7" containerID="221a3e9bb54bc45b1a9a4a543aadeb429393adbc0fe46f6f79ad34e45269413a" exitCode=0 Mar 08 20:06:03 crc kubenswrapper[4885]: I0308 20:06:03.187904 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" event={"ID":"f2563b40-3861-46cb-b313-1c221b526aa7","Type":"ContainerDied","Data":"221a3e9bb54bc45b1a9a4a543aadeb429393adbc0fe46f6f79ad34e45269413a"} Mar 08 20:06:04 crc kubenswrapper[4885]: I0308 20:06:04.615663 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:04 crc kubenswrapper[4885]: I0308 20:06:04.722432 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t4dc\" (UniqueName: \"kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc\") pod \"f2563b40-3861-46cb-b313-1c221b526aa7\" (UID: \"f2563b40-3861-46cb-b313-1c221b526aa7\") " Mar 08 20:06:04 crc kubenswrapper[4885]: I0308 20:06:04.729272 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc" (OuterVolumeSpecName: "kube-api-access-5t4dc") pod "f2563b40-3861-46cb-b313-1c221b526aa7" (UID: "f2563b40-3861-46cb-b313-1c221b526aa7"). InnerVolumeSpecName "kube-api-access-5t4dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:06:04 crc kubenswrapper[4885]: I0308 20:06:04.824199 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t4dc\" (UniqueName: \"kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc\") on node \"crc\" DevicePath \"\"" Mar 08 20:06:05 crc kubenswrapper[4885]: I0308 20:06:05.220201 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" event={"ID":"f2563b40-3861-46cb-b313-1c221b526aa7","Type":"ContainerDied","Data":"8338dd38101eb0f2437313f523dea1c5dee54a4c1716e9638cf7059653c9580d"} Mar 08 20:06:05 crc kubenswrapper[4885]: I0308 20:06:05.220719 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8338dd38101eb0f2437313f523dea1c5dee54a4c1716e9638cf7059653c9580d" Mar 08 20:06:05 crc kubenswrapper[4885]: I0308 20:06:05.220273 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:05 crc kubenswrapper[4885]: I0308 20:06:05.711698 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550000-bp85d"] Mar 08 20:06:05 crc kubenswrapper[4885]: I0308 20:06:05.725943 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550000-bp85d"] Mar 08 20:06:07 crc kubenswrapper[4885]: I0308 20:06:07.383721 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d589285c-60a3-4871-9149-7f1f99fc35ee" path="/var/lib/kubelet/pods/d589285c-60a3-4871-9149-7f1f99fc35ee/volumes" Mar 08 20:06:08 crc kubenswrapper[4885]: I0308 20:06:08.368976 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:06:09 crc kubenswrapper[4885]: I0308 20:06:09.261143 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a"} Mar 08 20:07:00 crc kubenswrapper[4885]: I0308 20:07:00.187133 4885 scope.go:117] "RemoveContainer" containerID="a1918622a7f691d5b0978579d743cac5d40266346f9de21b0dbb76cf8ca3f823" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.159380 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550008-v776t"] Mar 08 20:08:00 crc kubenswrapper[4885]: E0308 20:08:00.160373 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2563b40-3861-46cb-b313-1c221b526aa7" containerName="oc" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.160418 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2563b40-3861-46cb-b313-1c221b526aa7" containerName="oc" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.160686 4885 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f2563b40-3861-46cb-b313-1c221b526aa7" containerName="oc" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.161392 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.165556 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.168343 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.168349 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.183580 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550008-v776t"] Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.204297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffb4b\" (UniqueName: \"kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b\") pod \"auto-csr-approver-29550008-v776t\" (UID: \"a693b4d7-ae29-482d-8b4d-8025be4ce19f\") " pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.305847 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffb4b\" (UniqueName: \"kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b\") pod \"auto-csr-approver-29550008-v776t\" (UID: \"a693b4d7-ae29-482d-8b4d-8025be4ce19f\") " pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.338947 4885 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-ffb4b\" (UniqueName: \"kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b\") pod \"auto-csr-approver-29550008-v776t\" (UID: \"a693b4d7-ae29-482d-8b4d-8025be4ce19f\") " pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.510693 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.835094 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550008-v776t"] Mar 08 20:08:00 crc kubenswrapper[4885]: W0308 20:08:00.837978 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda693b4d7_ae29_482d_8b4d_8025be4ce19f.slice/crio-a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5 WatchSource:0}: Error finding container a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5: Status 404 returned error can't find the container with id a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5 Mar 08 20:08:01 crc kubenswrapper[4885]: I0308 20:08:01.305431 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550008-v776t" event={"ID":"a693b4d7-ae29-482d-8b4d-8025be4ce19f","Type":"ContainerStarted","Data":"a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5"} Mar 08 20:08:02 crc kubenswrapper[4885]: I0308 20:08:02.320810 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550008-v776t" event={"ID":"a693b4d7-ae29-482d-8b4d-8025be4ce19f","Type":"ContainerStarted","Data":"e3ca62c7c9954975ab02aef71fa3fa4daa75c76bdca0ee86cf37dfeed6b0d2ef"} Mar 08 20:08:02 crc kubenswrapper[4885]: I0308 20:08:02.350817 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29550008-v776t" podStartSLOduration=1.307888983 podStartE2EDuration="2.350795935s" podCreationTimestamp="2026-03-08 20:08:00 +0000 UTC" firstStartedPulling="2026-03-08 20:08:00.842564552 +0000 UTC m=+2182.238618605" lastFinishedPulling="2026-03-08 20:08:01.885471524 +0000 UTC m=+2183.281525557" observedRunningTime="2026-03-08 20:08:02.34345708 +0000 UTC m=+2183.739511143" watchObservedRunningTime="2026-03-08 20:08:02.350795935 +0000 UTC m=+2183.746849978" Mar 08 20:08:03 crc kubenswrapper[4885]: I0308 20:08:03.335143 4885 generic.go:334] "Generic (PLEG): container finished" podID="a693b4d7-ae29-482d-8b4d-8025be4ce19f" containerID="e3ca62c7c9954975ab02aef71fa3fa4daa75c76bdca0ee86cf37dfeed6b0d2ef" exitCode=0 Mar 08 20:08:03 crc kubenswrapper[4885]: I0308 20:08:03.335220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550008-v776t" event={"ID":"a693b4d7-ae29-482d-8b4d-8025be4ce19f","Type":"ContainerDied","Data":"e3ca62c7c9954975ab02aef71fa3fa4daa75c76bdca0ee86cf37dfeed6b0d2ef"} Mar 08 20:08:04 crc kubenswrapper[4885]: I0308 20:08:04.709771 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:04 crc kubenswrapper[4885]: I0308 20:08:04.881785 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffb4b\" (UniqueName: \"kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b\") pod \"a693b4d7-ae29-482d-8b4d-8025be4ce19f\" (UID: \"a693b4d7-ae29-482d-8b4d-8025be4ce19f\") " Mar 08 20:08:04 crc kubenswrapper[4885]: I0308 20:08:04.891019 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b" (OuterVolumeSpecName: "kube-api-access-ffb4b") pod "a693b4d7-ae29-482d-8b4d-8025be4ce19f" (UID: "a693b4d7-ae29-482d-8b4d-8025be4ce19f"). InnerVolumeSpecName "kube-api-access-ffb4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:08:04 crc kubenswrapper[4885]: I0308 20:08:04.984967 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffb4b\" (UniqueName: \"kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b\") on node \"crc\" DevicePath \"\"" Mar 08 20:08:05 crc kubenswrapper[4885]: I0308 20:08:05.358137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550008-v776t" event={"ID":"a693b4d7-ae29-482d-8b4d-8025be4ce19f","Type":"ContainerDied","Data":"a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5"} Mar 08 20:08:05 crc kubenswrapper[4885]: I0308 20:08:05.358198 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5" Mar 08 20:08:05 crc kubenswrapper[4885]: I0308 20:08:05.358278 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:05 crc kubenswrapper[4885]: I0308 20:08:05.447094 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550002-jw6jr"] Mar 08 20:08:05 crc kubenswrapper[4885]: I0308 20:08:05.457694 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550002-jw6jr"] Mar 08 20:08:07 crc kubenswrapper[4885]: I0308 20:08:07.385442 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb49670-90cc-4dfa-9e28-1ae44ed1104b" path="/var/lib/kubelet/pods/cdb49670-90cc-4dfa-9e28-1ae44ed1104b/volumes" Mar 08 20:08:32 crc kubenswrapper[4885]: I0308 20:08:32.818406 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:08:32 crc kubenswrapper[4885]: I0308 20:08:32.819070 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:09:00 crc kubenswrapper[4885]: I0308 20:09:00.310665 4885 scope.go:117] "RemoveContainer" containerID="85234bd429cc704e5f096400e4f1b62d94a0128f9b3c1c36899a84d4f6ac1ba9" Mar 08 20:09:02 crc kubenswrapper[4885]: I0308 20:09:02.818672 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:09:02 crc kubenswrapper[4885]: 
I0308 20:09:02.819648 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:09:32 crc kubenswrapper[4885]: I0308 20:09:32.818532 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:09:32 crc kubenswrapper[4885]: I0308 20:09:32.820048 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:09:32 crc kubenswrapper[4885]: I0308 20:09:32.820134 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:09:32 crc kubenswrapper[4885]: I0308 20:09:32.820997 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:09:32 crc kubenswrapper[4885]: I0308 20:09:32.821094 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" 
containerName="machine-config-daemon" containerID="cri-o://b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a" gracePeriod=600 Mar 08 20:09:33 crc kubenswrapper[4885]: I0308 20:09:33.137433 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a" exitCode=0 Mar 08 20:09:33 crc kubenswrapper[4885]: I0308 20:09:33.137469 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a"} Mar 08 20:09:33 crc kubenswrapper[4885]: I0308 20:09:33.137972 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:09:34 crc kubenswrapper[4885]: I0308 20:09:34.151792 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"} Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.167217 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550010-xflbc"] Mar 08 20:10:00 crc kubenswrapper[4885]: E0308 20:10:00.168248 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a693b4d7-ae29-482d-8b4d-8025be4ce19f" containerName="oc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.168270 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a693b4d7-ae29-482d-8b4d-8025be4ce19f" containerName="oc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.168552 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a693b4d7-ae29-482d-8b4d-8025be4ce19f" containerName="oc" Mar 08 20:10:00 
crc kubenswrapper[4885]: I0308 20:10:00.169268 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.172112 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.173187 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.173241 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.177412 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550010-xflbc"] Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.306218 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2b5j\" (UniqueName: \"kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j\") pod \"auto-csr-approver-29550010-xflbc\" (UID: \"14bc9568-8018-41b0-9d6f-1b71feaa1021\") " pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.407965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2b5j\" (UniqueName: \"kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j\") pod \"auto-csr-approver-29550010-xflbc\" (UID: \"14bc9568-8018-41b0-9d6f-1b71feaa1021\") " pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.447792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2b5j\" (UniqueName: \"kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j\") 
pod \"auto-csr-approver-29550010-xflbc\" (UID: \"14bc9568-8018-41b0-9d6f-1b71feaa1021\") " pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.500315 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:01 crc kubenswrapper[4885]: I0308 20:10:01.062495 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:10:01 crc kubenswrapper[4885]: I0308 20:10:01.064211 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550010-xflbc"] Mar 08 20:10:01 crc kubenswrapper[4885]: I0308 20:10:01.429360 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550010-xflbc" event={"ID":"14bc9568-8018-41b0-9d6f-1b71feaa1021","Type":"ContainerStarted","Data":"3105cec59e23fd26d2430e5263de03fdea6f871660e0cf2b5080e3ffd87c69e5"} Mar 08 20:10:02 crc kubenswrapper[4885]: I0308 20:10:02.438743 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550010-xflbc" event={"ID":"14bc9568-8018-41b0-9d6f-1b71feaa1021","Type":"ContainerStarted","Data":"a94ec38b15c7d4aceca8f088f44ff876dec7a0f0f94b13e1c65ece3d594f4ff4"} Mar 08 20:10:02 crc kubenswrapper[4885]: I0308 20:10:02.461719 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550010-xflbc" podStartSLOduration=1.559694219 podStartE2EDuration="2.461687923s" podCreationTimestamp="2026-03-08 20:10:00 +0000 UTC" firstStartedPulling="2026-03-08 20:10:01.061662375 +0000 UTC m=+2302.457716438" lastFinishedPulling="2026-03-08 20:10:01.963656119 +0000 UTC m=+2303.359710142" observedRunningTime="2026-03-08 20:10:02.458467317 +0000 UTC m=+2303.854521380" watchObservedRunningTime="2026-03-08 20:10:02.461687923 +0000 UTC m=+2303.857742006" Mar 08 20:10:03 crc 
kubenswrapper[4885]: I0308 20:10:03.449649 4885 generic.go:334] "Generic (PLEG): container finished" podID="14bc9568-8018-41b0-9d6f-1b71feaa1021" containerID="a94ec38b15c7d4aceca8f088f44ff876dec7a0f0f94b13e1c65ece3d594f4ff4" exitCode=0 Mar 08 20:10:03 crc kubenswrapper[4885]: I0308 20:10:03.449710 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550010-xflbc" event={"ID":"14bc9568-8018-41b0-9d6f-1b71feaa1021","Type":"ContainerDied","Data":"a94ec38b15c7d4aceca8f088f44ff876dec7a0f0f94b13e1c65ece3d594f4ff4"} Mar 08 20:10:04 crc kubenswrapper[4885]: I0308 20:10:04.841434 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:04 crc kubenswrapper[4885]: I0308 20:10:04.982007 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2b5j\" (UniqueName: \"kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j\") pod \"14bc9568-8018-41b0-9d6f-1b71feaa1021\" (UID: \"14bc9568-8018-41b0-9d6f-1b71feaa1021\") " Mar 08 20:10:04 crc kubenswrapper[4885]: I0308 20:10:04.989564 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j" (OuterVolumeSpecName: "kube-api-access-r2b5j") pod "14bc9568-8018-41b0-9d6f-1b71feaa1021" (UID: "14bc9568-8018-41b0-9d6f-1b71feaa1021"). InnerVolumeSpecName "kube-api-access-r2b5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.083774 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2b5j\" (UniqueName: \"kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j\") on node \"crc\" DevicePath \"\"" Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.473557 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550010-xflbc" event={"ID":"14bc9568-8018-41b0-9d6f-1b71feaa1021","Type":"ContainerDied","Data":"3105cec59e23fd26d2430e5263de03fdea6f871660e0cf2b5080e3ffd87c69e5"} Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.473633 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3105cec59e23fd26d2430e5263de03fdea6f871660e0cf2b5080e3ffd87c69e5" Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.473650 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.528996 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550004-sf7cv"] Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.534868 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550004-sf7cv"] Mar 08 20:10:07 crc kubenswrapper[4885]: I0308 20:10:07.385164 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24ca97d-74ff-4a8d-9621-40d03e8be6cc" path="/var/lib/kubelet/pods/b24ca97d-74ff-4a8d-9621-40d03e8be6cc/volumes" Mar 08 20:11:00 crc kubenswrapper[4885]: I0308 20:11:00.417253 4885 scope.go:117] "RemoveContainer" containerID="59156303cc1056f8c42d0e37cf490933d64202838fc19ab28796fa13df721c66" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.166917 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29550012-l9ql6"] Mar 08 20:12:00 crc kubenswrapper[4885]: E0308 20:12:00.169432 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14bc9568-8018-41b0-9d6f-1b71feaa1021" containerName="oc" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.169474 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="14bc9568-8018-41b0-9d6f-1b71feaa1021" containerName="oc" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.169652 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="14bc9568-8018-41b0-9d6f-1b71feaa1021" containerName="oc" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.170140 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.173712 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.174042 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.174381 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.176902 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550012-l9ql6"] Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.215553 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stkkn\" (UniqueName: \"kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn\") pod \"auto-csr-approver-29550012-l9ql6\" (UID: \"8c0dd782-7d55-432a-a4c4-72eab3a342f0\") " pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 
20:12:00.317474 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stkkn\" (UniqueName: \"kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn\") pod \"auto-csr-approver-29550012-l9ql6\" (UID: \"8c0dd782-7d55-432a-a4c4-72eab3a342f0\") " pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.337473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stkkn\" (UniqueName: \"kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn\") pod \"auto-csr-approver-29550012-l9ql6\" (UID: \"8c0dd782-7d55-432a-a4c4-72eab3a342f0\") " pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.515446 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.998300 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550012-l9ql6"] Mar 08 20:12:01 crc kubenswrapper[4885]: I0308 20:12:01.603500 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" event={"ID":"8c0dd782-7d55-432a-a4c4-72eab3a342f0","Type":"ContainerStarted","Data":"7de121d44fef7b71077fe52cabbc8997f38c115f104f7b0cdeba56eff3a43c7b"} Mar 08 20:12:02 crc kubenswrapper[4885]: I0308 20:12:02.613280 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" event={"ID":"8c0dd782-7d55-432a-a4c4-72eab3a342f0","Type":"ContainerStarted","Data":"88a04df5e845bc6b15ebd93cf0a05c83311989ab76aa928e4be86bfc18c02a82"} Mar 08 20:12:02 crc kubenswrapper[4885]: I0308 20:12:02.638109 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" 
podStartSLOduration=1.527782276 podStartE2EDuration="2.638090074s" podCreationTimestamp="2026-03-08 20:12:00 +0000 UTC" firstStartedPulling="2026-03-08 20:12:01.016163011 +0000 UTC m=+2422.412217054" lastFinishedPulling="2026-03-08 20:12:02.126470799 +0000 UTC m=+2423.522524852" observedRunningTime="2026-03-08 20:12:02.633283436 +0000 UTC m=+2424.029337499" watchObservedRunningTime="2026-03-08 20:12:02.638090074 +0000 UTC m=+2424.034144107" Mar 08 20:12:02 crc kubenswrapper[4885]: I0308 20:12:02.818660 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:12:02 crc kubenswrapper[4885]: I0308 20:12:02.818743 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:12:03 crc kubenswrapper[4885]: I0308 20:12:03.626735 4885 generic.go:334] "Generic (PLEG): container finished" podID="8c0dd782-7d55-432a-a4c4-72eab3a342f0" containerID="88a04df5e845bc6b15ebd93cf0a05c83311989ab76aa928e4be86bfc18c02a82" exitCode=0 Mar 08 20:12:03 crc kubenswrapper[4885]: I0308 20:12:03.626802 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" event={"ID":"8c0dd782-7d55-432a-a4c4-72eab3a342f0","Type":"ContainerDied","Data":"88a04df5e845bc6b15ebd93cf0a05c83311989ab76aa928e4be86bfc18c02a82"} Mar 08 20:12:04 crc kubenswrapper[4885]: I0308 20:12:04.992322 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.093278 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stkkn\" (UniqueName: \"kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn\") pod \"8c0dd782-7d55-432a-a4c4-72eab3a342f0\" (UID: \"8c0dd782-7d55-432a-a4c4-72eab3a342f0\") " Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.099995 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn" (OuterVolumeSpecName: "kube-api-access-stkkn") pod "8c0dd782-7d55-432a-a4c4-72eab3a342f0" (UID: "8c0dd782-7d55-432a-a4c4-72eab3a342f0"). InnerVolumeSpecName "kube-api-access-stkkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.195190 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stkkn\" (UniqueName: \"kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn\") on node \"crc\" DevicePath \"\"" Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.649791 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" event={"ID":"8c0dd782-7d55-432a-a4c4-72eab3a342f0","Type":"ContainerDied","Data":"7de121d44fef7b71077fe52cabbc8997f38c115f104f7b0cdeba56eff3a43c7b"} Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.650178 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de121d44fef7b71077fe52cabbc8997f38c115f104f7b0cdeba56eff3a43c7b" Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.649915 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.722265 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550006-6vnqt"] Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.734224 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550006-6vnqt"] Mar 08 20:12:07 crc kubenswrapper[4885]: I0308 20:12:07.401860 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2563b40-3861-46cb-b313-1c221b526aa7" path="/var/lib/kubelet/pods/f2563b40-3861-46cb-b313-1c221b526aa7/volumes" Mar 08 20:12:32 crc kubenswrapper[4885]: I0308 20:12:32.818172 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:12:32 crc kubenswrapper[4885]: I0308 20:12:32.818997 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:13:00 crc kubenswrapper[4885]: I0308 20:13:00.536764 4885 scope.go:117] "RemoveContainer" containerID="221a3e9bb54bc45b1a9a4a543aadeb429393adbc0fe46f6f79ad34e45269413a" Mar 08 20:13:02 crc kubenswrapper[4885]: I0308 20:13:02.818358 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:13:02 crc kubenswrapper[4885]: 
I0308 20:13:02.818778 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:13:02 crc kubenswrapper[4885]: I0308 20:13:02.818825 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:13:02 crc kubenswrapper[4885]: I0308 20:13:02.819452 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:13:02 crc kubenswrapper[4885]: I0308 20:13:02.819509 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" gracePeriod=600 Mar 08 20:13:02 crc kubenswrapper[4885]: E0308 20:13:02.952834 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:13:03 crc kubenswrapper[4885]: I0308 20:13:03.184460 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" exitCode=0 Mar 08 20:13:03 crc kubenswrapper[4885]: I0308 20:13:03.184503 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"} Mar 08 20:13:03 crc kubenswrapper[4885]: I0308 20:13:03.184538 4885 scope.go:117] "RemoveContainer" containerID="b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a" Mar 08 20:13:03 crc kubenswrapper[4885]: I0308 20:13:03.184947 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:13:03 crc kubenswrapper[4885]: E0308 20:13:03.185152 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:13:17 crc kubenswrapper[4885]: I0308 20:13:17.407956 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:13:17 crc kubenswrapper[4885]: E0308 20:13:17.408702 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 
20:13:31 crc kubenswrapper[4885]: I0308 20:13:31.368736 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:13:31 crc kubenswrapper[4885]: E0308 20:13:31.370152 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:13:45 crc kubenswrapper[4885]: I0308 20:13:45.368548 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:13:45 crc kubenswrapper[4885]: E0308 20:13:45.369708 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:13:56 crc kubenswrapper[4885]: I0308 20:13:56.368758 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:13:56 crc kubenswrapper[4885]: E0308 20:13:56.369616 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.167477 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550014-9hsx2"] Mar 08 20:14:00 crc kubenswrapper[4885]: E0308 20:14:00.167889 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0dd782-7d55-432a-a4c4-72eab3a342f0" containerName="oc" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.167910 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0dd782-7d55-432a-a4c4-72eab3a342f0" containerName="oc" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.168214 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c0dd782-7d55-432a-a4c4-72eab3a342f0" containerName="oc" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.169442 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.173542 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.173730 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.173730 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.193647 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550014-9hsx2"] Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.321428 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-576td\" (UniqueName: \"kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td\") pod 
\"auto-csr-approver-29550014-9hsx2\" (UID: \"fba06a47-da5c-44a3-9184-d2d92d14ce91\") " pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.423821 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-576td\" (UniqueName: \"kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td\") pod \"auto-csr-approver-29550014-9hsx2\" (UID: \"fba06a47-da5c-44a3-9184-d2d92d14ce91\") " pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.445420 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-576td\" (UniqueName: \"kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td\") pod \"auto-csr-approver-29550014-9hsx2\" (UID: \"fba06a47-da5c-44a3-9184-d2d92d14ce91\") " pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.515112 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.999254 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550014-9hsx2"] Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.521752 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"] Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.523933 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.539468 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"] Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.540573 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.540723 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhlgv\" (UniqueName: \"kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.540776 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.642603 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhlgv\" (UniqueName: \"kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.642941 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.643073 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.643428 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.643552 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.669956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhlgv\" (UniqueName: \"kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.760885 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" 
event={"ID":"fba06a47-da5c-44a3-9184-d2d92d14ce91","Type":"ContainerStarted","Data":"8fa71bcc9cadb5737db6a912639dac0f75701bde33b2020f436a23a83e9430b0"} Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.855062 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:02 crc kubenswrapper[4885]: I0308 20:14:02.094029 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"] Mar 08 20:14:02 crc kubenswrapper[4885]: W0308 20:14:02.101195 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa818349_b932_4e77_a8c6_6d200c15e61f.slice/crio-d14dc0da59b4b2e35dfc751eb386a45c2ac52286cde582275df925bc4b25a000 WatchSource:0}: Error finding container d14dc0da59b4b2e35dfc751eb386a45c2ac52286cde582275df925bc4b25a000: Status 404 returned error can't find the container with id d14dc0da59b4b2e35dfc751eb386a45c2ac52286cde582275df925bc4b25a000 Mar 08 20:14:02 crc kubenswrapper[4885]: I0308 20:14:02.773900 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerID="9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7" exitCode=0 Mar 08 20:14:02 crc kubenswrapper[4885]: I0308 20:14:02.773962 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerDied","Data":"9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7"} Mar 08 20:14:02 crc kubenswrapper[4885]: I0308 20:14:02.773989 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerStarted","Data":"d14dc0da59b4b2e35dfc751eb386a45c2ac52286cde582275df925bc4b25a000"} Mar 08 20:14:03 crc kubenswrapper[4885]: I0308 
20:14:03.789678 4885 generic.go:334] "Generic (PLEG): container finished" podID="fba06a47-da5c-44a3-9184-d2d92d14ce91" containerID="c1bd976fbf85046e9452c23b99e63722a390fc224cc0805a726ba4a52e322c78" exitCode=0 Mar 08 20:14:03 crc kubenswrapper[4885]: I0308 20:14:03.789762 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" event={"ID":"fba06a47-da5c-44a3-9184-d2d92d14ce91","Type":"ContainerDied","Data":"c1bd976fbf85046e9452c23b99e63722a390fc224cc0805a726ba4a52e322c78"} Mar 08 20:14:04 crc kubenswrapper[4885]: I0308 20:14:04.805232 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerID="aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f" exitCode=0 Mar 08 20:14:04 crc kubenswrapper[4885]: I0308 20:14:04.805304 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerDied","Data":"aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f"} Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.221006 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.399415 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-576td\" (UniqueName: \"kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td\") pod \"fba06a47-da5c-44a3-9184-d2d92d14ce91\" (UID: \"fba06a47-da5c-44a3-9184-d2d92d14ce91\") " Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.420029 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td" (OuterVolumeSpecName: "kube-api-access-576td") pod "fba06a47-da5c-44a3-9184-d2d92d14ce91" (UID: "fba06a47-da5c-44a3-9184-d2d92d14ce91"). InnerVolumeSpecName "kube-api-access-576td". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.502235 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-576td\" (UniqueName: \"kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.816808 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" event={"ID":"fba06a47-da5c-44a3-9184-d2d92d14ce91","Type":"ContainerDied","Data":"8fa71bcc9cadb5737db6a912639dac0f75701bde33b2020f436a23a83e9430b0"} Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.817199 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fa71bcc9cadb5737db6a912639dac0f75701bde33b2020f436a23a83e9430b0" Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.816834 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.820947 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerStarted","Data":"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c"} Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.857529 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tnwk6" podStartSLOduration=2.405304759 podStartE2EDuration="4.857500226s" podCreationTimestamp="2026-03-08 20:14:01 +0000 UTC" firstStartedPulling="2026-03-08 20:14:02.776412303 +0000 UTC m=+2544.172466326" lastFinishedPulling="2026-03-08 20:14:05.22860773 +0000 UTC m=+2546.624661793" observedRunningTime="2026-03-08 20:14:05.844652914 +0000 UTC m=+2547.240706977" watchObservedRunningTime="2026-03-08 20:14:05.857500226 +0000 UTC m=+2547.253554289" Mar 08 20:14:06 crc kubenswrapper[4885]: I0308 20:14:06.347765 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550008-v776t"] Mar 08 20:14:06 crc kubenswrapper[4885]: I0308 20:14:06.360160 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550008-v776t"] Mar 08 20:14:07 crc kubenswrapper[4885]: I0308 20:14:07.384480 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a693b4d7-ae29-482d-8b4d-8025be4ce19f" path="/var/lib/kubelet/pods/a693b4d7-ae29-482d-8b4d-8025be4ce19f/volumes" Mar 08 20:14:08 crc kubenswrapper[4885]: I0308 20:14:08.369225 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:14:08 crc kubenswrapper[4885]: E0308 20:14:08.369603 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:14:11 crc kubenswrapper[4885]: I0308 20:14:11.855868 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:11 crc kubenswrapper[4885]: I0308 20:14:11.856374 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:12 crc kubenswrapper[4885]: I0308 20:14:12.929352 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tnwk6" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="registry-server" probeResult="failure" output=< Mar 08 20:14:12 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 20:14:12 crc kubenswrapper[4885]: > Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.693340 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"] Mar 08 20:14:13 crc kubenswrapper[4885]: E0308 20:14:13.693796 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba06a47-da5c-44a3-9184-d2d92d14ce91" containerName="oc" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.693825 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fba06a47-da5c-44a3-9184-d2d92d14ce91" containerName="oc" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.694099 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="fba06a47-da5c-44a3-9184-d2d92d14ce91" containerName="oc" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.695777 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.728123 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"] Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.743611 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgwq8\" (UniqueName: \"kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.743700 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.743734 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.844456 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgwq8\" (UniqueName: \"kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.844517 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.844538 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.845171 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.845190 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.862872 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgwq8\" (UniqueName: \"kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:14 crc kubenswrapper[4885]: I0308 20:14:14.079843 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4qwhh"
Mar 08 20:14:14 crc kubenswrapper[4885]: I0308 20:14:14.573335 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"]
Mar 08 20:14:14 crc kubenswrapper[4885]: I0308 20:14:14.896101 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerStarted","Data":"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7"}
Mar 08 20:14:14 crc kubenswrapper[4885]: I0308 20:14:14.896461 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerStarted","Data":"7b14c100c2776453655789b80e53f3f167965cd1849e65873687ab218eeb5a83"}
Mar 08 20:14:15 crc kubenswrapper[4885]: I0308 20:14:15.910966 4885 generic.go:334] "Generic (PLEG): container finished" podID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerID="d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7" exitCode=0
Mar 08 20:14:15 crc kubenswrapper[4885]: I0308 20:14:15.911053 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerDied","Data":"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7"}
Mar 08 20:14:16 crc kubenswrapper[4885]: I0308 20:14:16.923401 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerStarted","Data":"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01"}
Mar 08 20:14:17 crc kubenswrapper[4885]: I0308 20:14:17.937152 4885 generic.go:334] "Generic (PLEG): container finished" podID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerID="a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01" exitCode=0
Mar 08 20:14:17 crc kubenswrapper[4885]: I0308 20:14:17.937205 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerDied","Data":"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01"}
Mar 08 20:14:18 crc kubenswrapper[4885]: I0308 20:14:18.948403 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerStarted","Data":"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6"}
Mar 08 20:14:18 crc kubenswrapper[4885]: I0308 20:14:18.993482 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4qwhh" podStartSLOduration=3.563928367 podStartE2EDuration="5.993446121s" podCreationTimestamp="2026-03-08 20:14:13 +0000 UTC" firstStartedPulling="2026-03-08 20:14:15.914505744 +0000 UTC m=+2557.310559807" lastFinishedPulling="2026-03-08 20:14:18.344023528 +0000 UTC m=+2559.740077561" observedRunningTime="2026-03-08 20:14:18.975777231 +0000 UTC m=+2560.371831314" watchObservedRunningTime="2026-03-08 20:14:18.993446121 +0000 UTC m=+2560.389500184"
Mar 08 20:14:21 crc kubenswrapper[4885]: I0308 20:14:21.370117 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:14:21 crc kubenswrapper[4885]: E0308 20:14:21.371382 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:14:21 crc kubenswrapper[4885]: I0308 20:14:21.914784 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tnwk6"
Mar 08 20:14:21 crc kubenswrapper[4885]: I0308 20:14:21.975782 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tnwk6"
Mar 08 20:14:22 crc kubenswrapper[4885]: I0308 20:14:22.880325 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"]
Mar 08 20:14:22 crc kubenswrapper[4885]: I0308 20:14:22.981280 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tnwk6" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="registry-server" containerID="cri-o://c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c" gracePeriod=2
Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.441865 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnwk6"
Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.623694 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content\") pod \"aa818349-b932-4e77-a8c6-6d200c15e61f\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") "
Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.623791 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities\") pod \"aa818349-b932-4e77-a8c6-6d200c15e61f\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") "
Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.623942 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhlgv\" (UniqueName: \"kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv\") pod \"aa818349-b932-4e77-a8c6-6d200c15e61f\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") "
Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.624977 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities" (OuterVolumeSpecName: "utilities") pod "aa818349-b932-4e77-a8c6-6d200c15e61f" (UID: "aa818349-b932-4e77-a8c6-6d200c15e61f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.635385 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv" (OuterVolumeSpecName: "kube-api-access-mhlgv") pod "aa818349-b932-4e77-a8c6-6d200c15e61f" (UID: "aa818349-b932-4e77-a8c6-6d200c15e61f"). InnerVolumeSpecName "kube-api-access-mhlgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.730345 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhlgv\" (UniqueName: \"kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv\") on node \"crc\" DevicePath \"\""
Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.730399 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.797956 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa818349-b932-4e77-a8c6-6d200c15e61f" (UID: "aa818349-b932-4e77-a8c6-6d200c15e61f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.832320 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.002886 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerID="c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c" exitCode=0
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.002971 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerDied","Data":"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c"}
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.003021 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerDied","Data":"d14dc0da59b4b2e35dfc751eb386a45c2ac52286cde582275df925bc4b25a000"}
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.003051 4885 scope.go:117] "RemoveContainer" containerID="c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.003042 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnwk6"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.031070 4885 scope.go:117] "RemoveContainer" containerID="aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.069534 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"]
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.075967 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"]
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.081176 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4qwhh"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.081258 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4qwhh"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.081959 4885 scope.go:117] "RemoveContainer" containerID="9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.123429 4885 scope.go:117] "RemoveContainer" containerID="c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c"
Mar 08 20:14:24 crc kubenswrapper[4885]: E0308 20:14:24.124000 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c\": container with ID starting with c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c not found: ID does not exist" containerID="c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.124045 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c"} err="failed to get container status \"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c\": rpc error: code = NotFound desc = could not find container \"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c\": container with ID starting with c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c not found: ID does not exist"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.124079 4885 scope.go:117] "RemoveContainer" containerID="aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f"
Mar 08 20:14:24 crc kubenswrapper[4885]: E0308 20:14:24.124507 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f\": container with ID starting with aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f not found: ID does not exist" containerID="aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.124534 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f"} err="failed to get container status \"aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f\": rpc error: code = NotFound desc = could not find container \"aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f\": container with ID starting with aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f not found: ID does not exist"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.124550 4885 scope.go:117] "RemoveContainer" containerID="9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7"
Mar 08 20:14:24 crc kubenswrapper[4885]: E0308 20:14:24.124785 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7\": container with ID starting with 9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7 not found: ID does not exist" containerID="9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.124855 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7"} err="failed to get container status \"9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7\": rpc error: code = NotFound desc = could not find container \"9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7\": container with ID starting with 9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7 not found: ID does not exist"
Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.132334 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4qwhh"
Mar 08 20:14:25 crc kubenswrapper[4885]: I0308 20:14:25.095595 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4qwhh"
Mar 08 20:14:25 crc kubenswrapper[4885]: I0308 20:14:25.385810 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" path="/var/lib/kubelet/pods/aa818349-b932-4e77-a8c6-6d200c15e61f/volumes"
Mar 08 20:14:26 crc kubenswrapper[4885]: I0308 20:14:26.476671 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"]
Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.034375 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4qwhh" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="registry-server" containerID="cri-o://a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6" gracePeriod=2
Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.556548 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4qwhh"
Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.694982 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content\") pod \"6258a7a2-e590-4201-a36b-26f438c35b4f\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") "
Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.695122 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities\") pod \"6258a7a2-e590-4201-a36b-26f438c35b4f\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") "
Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.695163 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgwq8\" (UniqueName: \"kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8\") pod \"6258a7a2-e590-4201-a36b-26f438c35b4f\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") "
Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.696701 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities" (OuterVolumeSpecName: "utilities") pod "6258a7a2-e590-4201-a36b-26f438c35b4f" (UID: "6258a7a2-e590-4201-a36b-26f438c35b4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.704513 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8" (OuterVolumeSpecName: "kube-api-access-xgwq8") pod "6258a7a2-e590-4201-a36b-26f438c35b4f" (UID: "6258a7a2-e590-4201-a36b-26f438c35b4f"). InnerVolumeSpecName "kube-api-access-xgwq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.796988 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.797044 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgwq8\" (UniqueName: \"kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8\") on node \"crc\" DevicePath \"\""
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.049271 4885 generic.go:334] "Generic (PLEG): container finished" podID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerID="a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6" exitCode=0
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.049332 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerDied","Data":"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6"}
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.049364 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4qwhh"
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.049422 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerDied","Data":"7b14c100c2776453655789b80e53f3f167965cd1849e65873687ab218eeb5a83"}
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.049457 4885 scope.go:117] "RemoveContainer" containerID="a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6"
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.080993 4885 scope.go:117] "RemoveContainer" containerID="a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01"
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.086955 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6258a7a2-e590-4201-a36b-26f438c35b4f" (UID: "6258a7a2-e590-4201-a36b-26f438c35b4f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.105605 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.110787 4885 scope.go:117] "RemoveContainer" containerID="d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7"
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.139253 4885 scope.go:117] "RemoveContainer" containerID="a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6"
Mar 08 20:14:28 crc kubenswrapper[4885]: E0308 20:14:28.139775 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6\": container with ID starting with a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6 not found: ID does not exist" containerID="a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6"
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.139896 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6"} err="failed to get container status \"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6\": rpc error: code = NotFound desc = could not find container \"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6\": container with ID starting with a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6 not found: ID does not exist"
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.140063 4885 scope.go:117] "RemoveContainer" containerID="a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01"
Mar 08 20:14:28 crc kubenswrapper[4885]: E0308 20:14:28.140670 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01\": container with ID starting with a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01 not found: ID does not exist" containerID="a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01"
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.140721 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01"} err="failed to get container status \"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01\": rpc error: code = NotFound desc = could not find container \"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01\": container with ID starting with a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01 not found: ID does not exist"
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.140748 4885 scope.go:117] "RemoveContainer" containerID="d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7"
Mar 08 20:14:28 crc kubenswrapper[4885]: E0308 20:14:28.141487 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7\": container with ID starting with d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7 not found: ID does not exist" containerID="d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7"
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.141529 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7"} err="failed to get container status \"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7\": rpc error: code = NotFound desc = could not find container \"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7\": container with ID starting with d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7 not found: ID does not exist"
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.410269 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"]
Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.432011 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"]
Mar 08 20:14:29 crc kubenswrapper[4885]: I0308 20:14:29.404167 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" path="/var/lib/kubelet/pods/6258a7a2-e590-4201-a36b-26f438c35b4f/volumes"
Mar 08 20:14:32 crc kubenswrapper[4885]: I0308 20:14:32.368572 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:14:32 crc kubenswrapper[4885]: E0308 20:14:32.369359 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815145 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"]
Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815694 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="extract-content"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815706 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="extract-content"
Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815714 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="registry-server"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815722 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="registry-server"
Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815740 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="extract-content"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815751 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="extract-content"
Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815761 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="extract-utilities"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815767 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="extract-utilities"
Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815779 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="extract-utilities"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815785 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="extract-utilities"
Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815794 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="registry-server"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815800 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="registry-server"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815966 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="registry-server"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815979 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="registry-server"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.816997 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.830400 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"]
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.953760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kzlp\" (UniqueName: \"kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.953812 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.953865 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.055870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kzlp\" (UniqueName: \"kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.055999 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.056105 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.056616 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.056869 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.077354 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kzlp\" (UniqueName: \"kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.153040 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.615518 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"]
Mar 08 20:14:37 crc kubenswrapper[4885]: W0308 20:14:37.630549 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef92c765_f2e8_4808_a0dc_f9ccdb20509e.slice/crio-bca972bad3453ff797681d426c40f0f63ffb7ec8d160b90afca8544044109b5e WatchSource:0}: Error finding container bca972bad3453ff797681d426c40f0f63ffb7ec8d160b90afca8544044109b5e: Status 404 returned error can't find the container with id bca972bad3453ff797681d426c40f0f63ffb7ec8d160b90afca8544044109b5e
Mar 08 20:14:38 crc kubenswrapper[4885]: I0308 20:14:38.157551 4885 generic.go:334] "Generic (PLEG): container finished" podID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerID="9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f" exitCode=0
Mar 08 20:14:38 crc kubenswrapper[4885]: I0308 20:14:38.157607 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerDied","Data":"9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f"}
Mar 08 20:14:38 crc kubenswrapper[4885]: I0308 20:14:38.157641 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerStarted","Data":"bca972bad3453ff797681d426c40f0f63ffb7ec8d160b90afca8544044109b5e"}
Mar 08 20:14:39 crc kubenswrapper[4885]: I0308 20:14:39.168409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerStarted","Data":"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88"}
Mar 08 20:14:40 crc kubenswrapper[4885]: I0308 20:14:40.182971 4885 generic.go:334] "Generic (PLEG): container finished" podID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerID="0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88" exitCode=0
Mar 08 20:14:40 crc kubenswrapper[4885]: I0308 20:14:40.183040 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerDied","Data":"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88"}
Mar 08 20:14:41 crc kubenswrapper[4885]: I0308 20:14:41.201112 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerStarted","Data":"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786"}
Mar 08 20:14:41 crc kubenswrapper[4885]: I0308 20:14:41.240694 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mhpk5" podStartSLOduration=2.784750011 podStartE2EDuration="5.240634217s" podCreationTimestamp="2026-03-08 20:14:36 +0000 UTC" firstStartedPulling="2026-03-08 20:14:38.159755168 +0000 UTC m=+2579.555809231" lastFinishedPulling="2026-03-08 20:14:40.615639374 +0000 UTC m=+2582.011693437" observedRunningTime="2026-03-08 20:14:41.230959689 +0000 UTC m=+2582.627013742" watchObservedRunningTime="2026-03-08 20:14:41.240634217 +0000 UTC m=+2582.636688280"
Mar 08 20:14:45 crc kubenswrapper[4885]: I0308 20:14:45.369196 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:14:45 crc kubenswrapper[4885]: E0308 20:14:45.370583 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:14:47 crc kubenswrapper[4885]: I0308 20:14:47.153901 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:47 crc kubenswrapper[4885]: I0308 20:14:47.155865 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:47 crc kubenswrapper[4885]: I0308 20:14:47.243798 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:47 crc kubenswrapper[4885]: I0308 20:14:47.329709 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:47 crc kubenswrapper[4885]: I0308 20:14:47.492438 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"]
Mar 08 20:14:49 crc kubenswrapper[4885]: I0308 20:14:49.274156 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mhpk5" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="registry-server" containerID="cri-o://c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786" gracePeriod=2
Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.267183 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.290555 4885 generic.go:334] "Generic (PLEG): container finished" podID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerID="c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786" exitCode=0
Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.290624 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerDied","Data":"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786"}
Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.290657 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhpk5"
Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.290678 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerDied","Data":"bca972bad3453ff797681d426c40f0f63ffb7ec8d160b90afca8544044109b5e"}
Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.290710 4885 scope.go:117] "RemoveContainer" containerID="c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786"
Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.316845 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content\") pod \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") "
Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.316992 4885
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kzlp\" (UniqueName: \"kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp\") pod \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.317124 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities\") pod \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.319469 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities" (OuterVolumeSpecName: "utilities") pod "ef92c765-f2e8-4808-a0dc-f9ccdb20509e" (UID: "ef92c765-f2e8-4808-a0dc-f9ccdb20509e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.326054 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp" (OuterVolumeSpecName: "kube-api-access-6kzlp") pod "ef92c765-f2e8-4808-a0dc-f9ccdb20509e" (UID: "ef92c765-f2e8-4808-a0dc-f9ccdb20509e"). InnerVolumeSpecName "kube-api-access-6kzlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.331401 4885 scope.go:117] "RemoveContainer" containerID="0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.357028 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef92c765-f2e8-4808-a0dc-f9ccdb20509e" (UID: "ef92c765-f2e8-4808-a0dc-f9ccdb20509e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.359306 4885 scope.go:117] "RemoveContainer" containerID="9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.414209 4885 scope.go:117] "RemoveContainer" containerID="c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786" Mar 08 20:14:50 crc kubenswrapper[4885]: E0308 20:14:50.415153 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786\": container with ID starting with c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786 not found: ID does not exist" containerID="c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.415283 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786"} err="failed to get container status \"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786\": rpc error: code = NotFound desc = could not find container \"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786\": container with ID starting 
with c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786 not found: ID does not exist" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.415341 4885 scope.go:117] "RemoveContainer" containerID="0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88" Mar 08 20:14:50 crc kubenswrapper[4885]: E0308 20:14:50.416117 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88\": container with ID starting with 0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88 not found: ID does not exist" containerID="0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.416198 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88"} err="failed to get container status \"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88\": rpc error: code = NotFound desc = could not find container \"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88\": container with ID starting with 0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88 not found: ID does not exist" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.416260 4885 scope.go:117] "RemoveContainer" containerID="9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f" Mar 08 20:14:50 crc kubenswrapper[4885]: E0308 20:14:50.416770 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f\": container with ID starting with 9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f not found: ID does not exist" containerID="9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f" Mar 08 20:14:50 
crc kubenswrapper[4885]: I0308 20:14:50.416811 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f"} err="failed to get container status \"9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f\": rpc error: code = NotFound desc = could not find container \"9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f\": container with ID starting with 9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f not found: ID does not exist" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.421012 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.421042 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.421055 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kzlp\" (UniqueName: \"kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.649317 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"] Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.658944 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"] Mar 08 20:14:51 crc kubenswrapper[4885]: I0308 20:14:51.386415 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" path="/var/lib/kubelet/pods/ef92c765-f2e8-4808-a0dc-f9ccdb20509e/volumes" Mar 08 20:14:57 
crc kubenswrapper[4885]: I0308 20:14:57.368591 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:14:57 crc kubenswrapper[4885]: E0308 20:14:57.369620 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.160887 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb"] Mar 08 20:15:00 crc kubenswrapper[4885]: E0308 20:15:00.161220 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="registry-server" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.161234 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="registry-server" Mar 08 20:15:00 crc kubenswrapper[4885]: E0308 20:15:00.161251 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="extract-content" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.161259 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="extract-content" Mar 08 20:15:00 crc kubenswrapper[4885]: E0308 20:15:00.161276 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="extract-utilities" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.161286 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="extract-utilities" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.161462 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="registry-server" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.161998 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.165832 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.166256 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.173404 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb"] Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.320517 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.320585 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zzx\" (UniqueName: \"kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 
20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.320822 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.422455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.422533 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zzx\" (UniqueName: \"kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.422650 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.424326 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: 
\"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.443390 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.454105 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zzx\" (UniqueName: \"kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.484228 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.645244 4885 scope.go:117] "RemoveContainer" containerID="e3ca62c7c9954975ab02aef71fa3fa4daa75c76bdca0ee86cf37dfeed6b0d2ef" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.793335 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb"] Mar 08 20:15:01 crc kubenswrapper[4885]: I0308 20:15:01.390821 4885 generic.go:334] "Generic (PLEG): container finished" podID="19d8979b-517e-4b02-8f5a-ead2361596ea" containerID="26e42e6d76089e28dc85056b673d6bddbefe770761a003ab85ecc351d49b7771" exitCode=0 Mar 08 20:15:01 crc kubenswrapper[4885]: I0308 20:15:01.391076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" event={"ID":"19d8979b-517e-4b02-8f5a-ead2361596ea","Type":"ContainerDied","Data":"26e42e6d76089e28dc85056b673d6bddbefe770761a003ab85ecc351d49b7771"} Mar 08 20:15:01 crc kubenswrapper[4885]: I0308 20:15:01.391258 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" event={"ID":"19d8979b-517e-4b02-8f5a-ead2361596ea","Type":"ContainerStarted","Data":"c59fcb1c1493c2243458e4817fc6c053c4398a0cdac69285394e895d25eea35e"} Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.829271 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.959534 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9zzx\" (UniqueName: \"kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx\") pod \"19d8979b-517e-4b02-8f5a-ead2361596ea\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.959699 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume\") pod \"19d8979b-517e-4b02-8f5a-ead2361596ea\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.959898 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume\") pod \"19d8979b-517e-4b02-8f5a-ead2361596ea\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.960910 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume" (OuterVolumeSpecName: "config-volume") pod "19d8979b-517e-4b02-8f5a-ead2361596ea" (UID: "19d8979b-517e-4b02-8f5a-ead2361596ea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.967781 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "19d8979b-517e-4b02-8f5a-ead2361596ea" (UID: "19d8979b-517e-4b02-8f5a-ead2361596ea"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.967791 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx" (OuterVolumeSpecName: "kube-api-access-z9zzx") pod "19d8979b-517e-4b02-8f5a-ead2361596ea" (UID: "19d8979b-517e-4b02-8f5a-ead2361596ea"). InnerVolumeSpecName "kube-api-access-z9zzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.061723 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.061790 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9zzx\" (UniqueName: \"kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx\") on node \"crc\" DevicePath \"\"" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.061807 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.416582 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.416478 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" event={"ID":"19d8979b-517e-4b02-8f5a-ead2361596ea","Type":"ContainerDied","Data":"c59fcb1c1493c2243458e4817fc6c053c4398a0cdac69285394e895d25eea35e"} Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.417547 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c59fcb1c1493c2243458e4817fc6c053c4398a0cdac69285394e895d25eea35e" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.925743 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"] Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.932179 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"] Mar 08 20:15:05 crc kubenswrapper[4885]: I0308 20:15:05.385062 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1329795d-a8f9-4896-adba-23c2c0da9261" path="/var/lib/kubelet/pods/1329795d-a8f9-4896-adba-23c2c0da9261/volumes" Mar 08 20:15:09 crc kubenswrapper[4885]: I0308 20:15:09.376449 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:15:09 crc kubenswrapper[4885]: E0308 20:15:09.377562 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:15:10 crc 
kubenswrapper[4885]: I0308 20:15:10.172576 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5bcbs"] Mar 08 20:15:10 crc kubenswrapper[4885]: E0308 20:15:10.173351 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d8979b-517e-4b02-8f5a-ead2361596ea" containerName="collect-profiles" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.173380 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d8979b-517e-4b02-8f5a-ead2361596ea" containerName="collect-profiles" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.173627 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d8979b-517e-4b02-8f5a-ead2361596ea" containerName="collect-profiles" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.175304 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.198728 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5bcbs"] Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.375593 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.375633 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dc6r\" (UniqueName: \"kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 
20:15:10.375908 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.480794 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.480999 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.481032 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dc6r\" (UniqueName: \"kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.481533 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 
20:15:10.481812    4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs"
Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.505813    4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dc6r\" (UniqueName: \"kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs"
Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.539993    4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5bcbs"
Mar 08 20:15:11 crc kubenswrapper[4885]: I0308 20:15:11.065466    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5bcbs"]
Mar 08 20:15:11 crc kubenswrapper[4885]: I0308 20:15:11.501653    4885 generic.go:334] "Generic (PLEG): container finished" podID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerID="65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f" exitCode=0
Mar 08 20:15:11 crc kubenswrapper[4885]: I0308 20:15:11.501712    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerDied","Data":"65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f"}
Mar 08 20:15:11 crc kubenswrapper[4885]: I0308 20:15:11.501751    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerStarted","Data":"b7621f432752e14ef81ea4741592219bed91c1151a2d13496f884aab8c9e039e"}
Mar 08 20:15:11 crc kubenswrapper[4885]: I0308 20:15:11.505309    4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 20:15:13 crc kubenswrapper[4885]: I0308 20:15:13.524274    4885 generic.go:334] "Generic (PLEG): container finished" podID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerID="2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80" exitCode=0
Mar 08 20:15:13 crc kubenswrapper[4885]: I0308 20:15:13.524438    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerDied","Data":"2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80"}
Mar 08 20:15:14 crc kubenswrapper[4885]: I0308 20:15:14.536486    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerStarted","Data":"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14"}
Mar 08 20:15:14 crc kubenswrapper[4885]: I0308 20:15:14.561979    4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5bcbs" podStartSLOduration=2.147329028 podStartE2EDuration="4.561961177s" podCreationTimestamp="2026-03-08 20:15:10 +0000 UTC" firstStartedPulling="2026-03-08 20:15:11.50485714 +0000 UTC m=+2612.900911203" lastFinishedPulling="2026-03-08 20:15:13.919489289 +0000 UTC m=+2615.315543352" observedRunningTime="2026-03-08 20:15:14.557422916 +0000 UTC m=+2615.953476939" watchObservedRunningTime="2026-03-08 20:15:14.561961177 +0000 UTC m=+2615.958015190"
Mar 08 20:15:20 crc kubenswrapper[4885]: I0308 20:15:20.541853    4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5bcbs"
Mar 08 20:15:20 crc kubenswrapper[4885]: I0308 20:15:20.542682    4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5bcbs"
Mar 08 20:15:20 crc kubenswrapper[4885]: I0308 20:15:20.607955    4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5bcbs"
Mar 08 20:15:20 crc kubenswrapper[4885]: I0308 20:15:20.701856    4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5bcbs"
Mar 08 20:15:20 crc kubenswrapper[4885]: I0308 20:15:20.891188    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5bcbs"]
Mar 08 20:15:21 crc kubenswrapper[4885]: I0308 20:15:21.368495    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:15:21 crc kubenswrapper[4885]: E0308 20:15:21.369430    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:15:22 crc kubenswrapper[4885]: I0308 20:15:22.609583    4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5bcbs" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="registry-server" containerID="cri-o://308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14" gracePeriod=2
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.169348    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5bcbs"
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.248340    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content\") pod \"c91aad57-6c3a-41b7-9844-b93c89a30127\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") "
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.251171    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities\") pod \"c91aad57-6c3a-41b7-9844-b93c89a30127\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") "
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.251355    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dc6r\" (UniqueName: \"kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r\") pod \"c91aad57-6c3a-41b7-9844-b93c89a30127\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") "
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.252333    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities" (OuterVolumeSpecName: "utilities") pod "c91aad57-6c3a-41b7-9844-b93c89a30127" (UID: "c91aad57-6c3a-41b7-9844-b93c89a30127"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.257792    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r" (OuterVolumeSpecName: "kube-api-access-2dc6r") pod "c91aad57-6c3a-41b7-9844-b93c89a30127" (UID: "c91aad57-6c3a-41b7-9844-b93c89a30127"). InnerVolumeSpecName "kube-api-access-2dc6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.298512    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c91aad57-6c3a-41b7-9844-b93c89a30127" (UID: "c91aad57-6c3a-41b7-9844-b93c89a30127"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.352765    4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dc6r\" (UniqueName: \"kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r\") on node \"crc\" DevicePath \"\""
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.352793    4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.352803    4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.628185    4885 generic.go:334] "Generic (PLEG): container finished" podID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerID="308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14" exitCode=0
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.628286    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerDied","Data":"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14"}
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.628354    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5bcbs"
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.628377    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerDied","Data":"b7621f432752e14ef81ea4741592219bed91c1151a2d13496f884aab8c9e039e"}
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.628419    4885 scope.go:117] "RemoveContainer" containerID="308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14"
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.672610    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5bcbs"]
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.679360    4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5bcbs"]
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.679439    4885 scope.go:117] "RemoveContainer" containerID="2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80"
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.711940    4885 scope.go:117] "RemoveContainer" containerID="65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f"
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.752022    4885 scope.go:117] "RemoveContainer" containerID="308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14"
Mar 08 20:15:23 crc kubenswrapper[4885]: E0308 20:15:23.752755    4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14\": container with ID starting with 308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14 not found: ID does not exist" containerID="308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14"
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.752818    4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14"} err="failed to get container status \"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14\": rpc error: code = NotFound desc = could not find container \"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14\": container with ID starting with 308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14 not found: ID does not exist"
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.752848    4885 scope.go:117] "RemoveContainer" containerID="2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80"
Mar 08 20:15:23 crc kubenswrapper[4885]: E0308 20:15:23.753460    4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80\": container with ID starting with 2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80 not found: ID does not exist" containerID="2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80"
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.753502    4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80"} err="failed to get container status \"2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80\": rpc error: code = NotFound desc = could not find container \"2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80\": container with ID starting with 2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80 not found: ID does not exist"
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.753528    4885 scope.go:117] "RemoveContainer" containerID="65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f"
Mar 08 20:15:23 crc kubenswrapper[4885]: E0308 20:15:23.753968    4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f\": container with ID starting with 65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f not found: ID does not exist" containerID="65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f"
Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.754049    4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f"} err="failed to get container status \"65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f\": rpc error: code = NotFound desc = could not find container \"65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f\": container with ID starting with 65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f not found: ID does not exist"
Mar 08 20:15:25 crc kubenswrapper[4885]: I0308 20:15:25.384543    4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" path="/var/lib/kubelet/pods/c91aad57-6c3a-41b7-9844-b93c89a30127/volumes"
Mar 08 20:15:32 crc kubenswrapper[4885]: I0308 20:15:32.368415    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:15:32 crc kubenswrapper[4885]: E0308 20:15:32.370259    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:15:47 crc kubenswrapper[4885]: I0308 20:15:47.369334    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:15:47 crc kubenswrapper[4885]: E0308 20:15:47.370297    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.162579    4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550016-dnrqv"]
Mar 08 20:16:00 crc kubenswrapper[4885]: E0308 20:16:00.163332    4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="extract-content"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.163350    4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="extract-content"
Mar 08 20:16:00 crc kubenswrapper[4885]: E0308 20:16:00.163365    4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="registry-server"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.163373    4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="registry-server"
Mar 08 20:16:00 crc kubenswrapper[4885]: E0308 20:16:00.163385    4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="extract-utilities"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.163421    4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="extract-utilities"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.163581    4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="registry-server"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.164148    4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550016-dnrqv"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.169582    4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.169798    4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.170222    4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.185503    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550016-dnrqv"]
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.287908    4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m58qt\" (UniqueName: \"kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt\") pod \"auto-csr-approver-29550016-dnrqv\" (UID: \"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0\") " pod="openshift-infra/auto-csr-approver-29550016-dnrqv"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.368382    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:16:00 crc kubenswrapper[4885]: E0308 20:16:00.368681    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.389850    4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m58qt\" (UniqueName: \"kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt\") pod \"auto-csr-approver-29550016-dnrqv\" (UID: \"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0\") " pod="openshift-infra/auto-csr-approver-29550016-dnrqv"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.410592    4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m58qt\" (UniqueName: \"kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt\") pod \"auto-csr-approver-29550016-dnrqv\" (UID: \"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0\") " pod="openshift-infra/auto-csr-approver-29550016-dnrqv"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.487374    4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550016-dnrqv"
Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.785813    4885 scope.go:117] "RemoveContainer" containerID="2dc1a91346ca1a3f953a586589c1cef9384b7780b322dbba277349a4d5f8d041"
Mar 08 20:16:01 crc kubenswrapper[4885]: I0308 20:16:01.027436    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550016-dnrqv"]
Mar 08 20:16:01 crc kubenswrapper[4885]: I0308 20:16:01.090462    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550016-dnrqv" event={"ID":"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0","Type":"ContainerStarted","Data":"1b99350a5def51ffd7403138deecf2ddaeb74b10865b87374e723960e9827387"}
Mar 08 20:16:03 crc kubenswrapper[4885]: I0308 20:16:03.108483    4885 generic.go:334] "Generic (PLEG): container finished" podID="9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" containerID="a43d61c4ab66d4467b0260a033d95a916347542edf45ef6f0bb611f886b4a5b2" exitCode=0
Mar 08 20:16:03 crc kubenswrapper[4885]: I0308 20:16:03.108625    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550016-dnrqv" event={"ID":"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0","Type":"ContainerDied","Data":"a43d61c4ab66d4467b0260a033d95a916347542edf45ef6f0bb611f886b4a5b2"}
Mar 08 20:16:04 crc kubenswrapper[4885]: I0308 20:16:04.468384    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550016-dnrqv"
Mar 08 20:16:04 crc kubenswrapper[4885]: I0308 20:16:04.561000    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m58qt\" (UniqueName: \"kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt\") pod \"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0\" (UID: \"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0\") "
Mar 08 20:16:04 crc kubenswrapper[4885]: I0308 20:16:04.569377    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt" (OuterVolumeSpecName: "kube-api-access-m58qt") pod "9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" (UID: "9d41fb78-094e-49ae-b3d1-4bd0dca17fc0"). InnerVolumeSpecName "kube-api-access-m58qt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:16:04 crc kubenswrapper[4885]: I0308 20:16:04.662353    4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m58qt\" (UniqueName: \"kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt\") on node \"crc\" DevicePath \"\""
Mar 08 20:16:05 crc kubenswrapper[4885]: I0308 20:16:05.133552    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550016-dnrqv" event={"ID":"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0","Type":"ContainerDied","Data":"1b99350a5def51ffd7403138deecf2ddaeb74b10865b87374e723960e9827387"}
Mar 08 20:16:05 crc kubenswrapper[4885]: I0308 20:16:05.134013    4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b99350a5def51ffd7403138deecf2ddaeb74b10865b87374e723960e9827387"
Mar 08 20:16:05 crc kubenswrapper[4885]: I0308 20:16:05.134090    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550016-dnrqv"
Mar 08 20:16:05 crc kubenswrapper[4885]: I0308 20:16:05.547519    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550010-xflbc"]
Mar 08 20:16:05 crc kubenswrapper[4885]: I0308 20:16:05.555238    4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550010-xflbc"]
Mar 08 20:16:07 crc kubenswrapper[4885]: I0308 20:16:07.382755    4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14bc9568-8018-41b0-9d6f-1b71feaa1021" path="/var/lib/kubelet/pods/14bc9568-8018-41b0-9d6f-1b71feaa1021/volumes"
Mar 08 20:16:14 crc kubenswrapper[4885]: I0308 20:16:14.368117    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:16:14 crc kubenswrapper[4885]: E0308 20:16:14.369263    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:16:29 crc kubenswrapper[4885]: I0308 20:16:29.376265    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:16:29 crc kubenswrapper[4885]: E0308 20:16:29.377285    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:16:42 crc kubenswrapper[4885]: I0308 20:16:42.368221    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:16:42 crc kubenswrapper[4885]: E0308 20:16:42.369777    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:16:56 crc kubenswrapper[4885]: I0308 20:16:56.368003    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:16:56 crc kubenswrapper[4885]: E0308 20:16:56.368767    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:17:00 crc kubenswrapper[4885]: I0308 20:17:00.869051    4885 scope.go:117] "RemoveContainer" containerID="a94ec38b15c7d4aceca8f088f44ff876dec7a0f0f94b13e1c65ece3d594f4ff4"
Mar 08 20:17:09 crc kubenswrapper[4885]: I0308 20:17:09.375102    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:17:09 crc kubenswrapper[4885]: E0308 20:17:09.376184    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:17:23 crc kubenswrapper[4885]: I0308 20:17:23.369051    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:17:23 crc kubenswrapper[4885]: E0308 20:17:23.369658    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:17:36 crc kubenswrapper[4885]: I0308 20:17:36.369797    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:17:36 crc kubenswrapper[4885]: E0308 20:17:36.370831    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:17:47 crc kubenswrapper[4885]: I0308 20:17:47.369570    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:17:47 crc kubenswrapper[4885]: E0308 20:17:47.370747    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.183330    4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550018-kfvl8"]
Mar 08 20:18:00 crc kubenswrapper[4885]: E0308 20:18:00.186107    4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" containerName="oc"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.186358    4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" containerName="oc"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.186734    4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" containerName="oc"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.187881    4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550018-kfvl8"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.191049    4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.191796    4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.192261    4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.229883    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550018-kfvl8"]
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.304717    4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7hfc\" (UniqueName: \"kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc\") pod \"auto-csr-approver-29550018-kfvl8\" (UID: \"059c0df3-29c7-4def-970d-ba52e1884b8f\") " pod="openshift-infra/auto-csr-approver-29550018-kfvl8"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.369519    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:18:00 crc kubenswrapper[4885]: E0308 20:18:00.369724    4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.406262    4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7hfc\" (UniqueName: \"kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc\") pod \"auto-csr-approver-29550018-kfvl8\" (UID: \"059c0df3-29c7-4def-970d-ba52e1884b8f\") " pod="openshift-infra/auto-csr-approver-29550018-kfvl8"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.426362    4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7hfc\" (UniqueName: \"kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc\") pod \"auto-csr-approver-29550018-kfvl8\" (UID: \"059c0df3-29c7-4def-970d-ba52e1884b8f\") " pod="openshift-infra/auto-csr-approver-29550018-kfvl8"
Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.515722    4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550018-kfvl8"
Mar 08 20:18:01 crc kubenswrapper[4885]: I0308 20:18:01.005590    4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550018-kfvl8"]
Mar 08 20:18:01 crc kubenswrapper[4885]: I0308 20:18:01.213001    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550018-kfvl8" event={"ID":"059c0df3-29c7-4def-970d-ba52e1884b8f","Type":"ContainerStarted","Data":"b8e3d15551152e4d35cc2deb4275283b006136a51cb606129885cb483cb3ec1a"}
Mar 08 20:18:03 crc kubenswrapper[4885]: I0308 20:18:03.235000    4885 generic.go:334] "Generic (PLEG): container finished" podID="059c0df3-29c7-4def-970d-ba52e1884b8f" containerID="bf1fcbdcbc45401a37fee6b7776fac0a669dfcd972b2838d8063a648b4ecf370" exitCode=0
Mar 08 20:18:03 crc kubenswrapper[4885]: I0308 20:18:03.235146    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550018-kfvl8" event={"ID":"059c0df3-29c7-4def-970d-ba52e1884b8f","Type":"ContainerDied","Data":"bf1fcbdcbc45401a37fee6b7776fac0a669dfcd972b2838d8063a648b4ecf370"}
Mar 08 20:18:04 crc kubenswrapper[4885]: I0308 20:18:04.505832    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550018-kfvl8"
Mar 08 20:18:04 crc kubenswrapper[4885]: I0308 20:18:04.593603    4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7hfc\" (UniqueName: \"kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc\") pod \"059c0df3-29c7-4def-970d-ba52e1884b8f\" (UID: \"059c0df3-29c7-4def-970d-ba52e1884b8f\") "
Mar 08 20:18:04 crc kubenswrapper[4885]: I0308 20:18:04.623152    4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc" (OuterVolumeSpecName: "kube-api-access-r7hfc") pod "059c0df3-29c7-4def-970d-ba52e1884b8f" (UID: "059c0df3-29c7-4def-970d-ba52e1884b8f"). InnerVolumeSpecName "kube-api-access-r7hfc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:18:04 crc kubenswrapper[4885]: I0308 20:18:04.695525    4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7hfc\" (UniqueName: \"kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc\") on node \"crc\" DevicePath \"\""
Mar 08 20:18:05 crc kubenswrapper[4885]: I0308 20:18:05.252703    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550018-kfvl8" event={"ID":"059c0df3-29c7-4def-970d-ba52e1884b8f","Type":"ContainerDied","Data":"b8e3d15551152e4d35cc2deb4275283b006136a51cb606129885cb483cb3ec1a"}
Mar 08 20:18:05 crc kubenswrapper[4885]: I0308 20:18:05.252794    4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8e3d15551152e4d35cc2deb4275283b006136a51cb606129885cb483cb3ec1a"
Mar 08 20:18:05 crc kubenswrapper[4885]: I0308 20:18:05.252750    4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550018-kfvl8"
Mar 08 20:18:05 crc kubenswrapper[4885]: I0308 20:18:05.593050    4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550012-l9ql6"]
Mar 08 20:18:05 crc kubenswrapper[4885]: I0308 20:18:05.598048    4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550012-l9ql6"]
Mar 08 20:18:07 crc kubenswrapper[4885]: I0308 20:18:07.386968    4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c0dd782-7d55-432a-a4c4-72eab3a342f0" path="/var/lib/kubelet/pods/8c0dd782-7d55-432a-a4c4-72eab3a342f0/volumes"
Mar 08 20:18:11 crc kubenswrapper[4885]: I0308 20:18:11.368352    4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"
Mar 08 20:18:12 crc kubenswrapper[4885]: I0308 20:18:12.318570    4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929"}
Mar 08 20:19:00 crc kubenswrapper[4885]: I0308 20:19:00.993615    4885 scope.go:117] "RemoveContainer" containerID="88a04df5e845bc6b15ebd93cf0a05c83311989ab76aa928e4be86bfc18c02a82"
Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.163748    4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550020-vm6nl"]
Mar 08 20:20:00 crc kubenswrapper[4885]: E0308 20:20:00.164891    4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059c0df3-29c7-4def-970d-ba52e1884b8f" containerName="oc"
Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.164915    4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="059c0df3-29c7-4def-970d-ba52e1884b8f" containerName="oc"
Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.165364    4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="059c0df3-29c7-4def-970d-ba52e1884b8f" containerName="oc"
Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.166308    4885 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.170024 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.170747 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.170812 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.178665 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550020-vm6nl"] Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.315352 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfx4\" (UniqueName: \"kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4\") pod \"auto-csr-approver-29550020-vm6nl\" (UID: \"c98e098b-03c5-49cd-8009-4d1dde33cd6d\") " pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.417366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwfx4\" (UniqueName: \"kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4\") pod \"auto-csr-approver-29550020-vm6nl\" (UID: \"c98e098b-03c5-49cd-8009-4d1dde33cd6d\") " pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.453207 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwfx4\" (UniqueName: \"kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4\") pod \"auto-csr-approver-29550020-vm6nl\" (UID: \"c98e098b-03c5-49cd-8009-4d1dde33cd6d\") " 
pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.501145 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:01 crc kubenswrapper[4885]: I0308 20:20:01.012566 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550020-vm6nl"] Mar 08 20:20:01 crc kubenswrapper[4885]: I0308 20:20:01.360254 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" event={"ID":"c98e098b-03c5-49cd-8009-4d1dde33cd6d","Type":"ContainerStarted","Data":"209ee4e8df421b11b91168058f4625287aa1bd9e162fc26f5ae45913d4df07ec"} Mar 08 20:20:03 crc kubenswrapper[4885]: I0308 20:20:03.387179 4885 generic.go:334] "Generic (PLEG): container finished" podID="c98e098b-03c5-49cd-8009-4d1dde33cd6d" containerID="c67ebacbdb0212a697eaa79c423debddfd753cb26e765c85de268bbfc9ff9476" exitCode=0 Mar 08 20:20:03 crc kubenswrapper[4885]: I0308 20:20:03.387261 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" event={"ID":"c98e098b-03c5-49cd-8009-4d1dde33cd6d","Type":"ContainerDied","Data":"c67ebacbdb0212a697eaa79c423debddfd753cb26e765c85de268bbfc9ff9476"} Mar 08 20:20:04 crc kubenswrapper[4885]: I0308 20:20:04.736248 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:04 crc kubenswrapper[4885]: I0308 20:20:04.885799 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwfx4\" (UniqueName: \"kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4\") pod \"c98e098b-03c5-49cd-8009-4d1dde33cd6d\" (UID: \"c98e098b-03c5-49cd-8009-4d1dde33cd6d\") " Mar 08 20:20:04 crc kubenswrapper[4885]: I0308 20:20:04.897174 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4" (OuterVolumeSpecName: "kube-api-access-nwfx4") pod "c98e098b-03c5-49cd-8009-4d1dde33cd6d" (UID: "c98e098b-03c5-49cd-8009-4d1dde33cd6d"). InnerVolumeSpecName "kube-api-access-nwfx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:20:04 crc kubenswrapper[4885]: I0308 20:20:04.987973 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwfx4\" (UniqueName: \"kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4\") on node \"crc\" DevicePath \"\"" Mar 08 20:20:05 crc kubenswrapper[4885]: I0308 20:20:05.417196 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" event={"ID":"c98e098b-03c5-49cd-8009-4d1dde33cd6d","Type":"ContainerDied","Data":"209ee4e8df421b11b91168058f4625287aa1bd9e162fc26f5ae45913d4df07ec"} Mar 08 20:20:05 crc kubenswrapper[4885]: I0308 20:20:05.417253 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="209ee4e8df421b11b91168058f4625287aa1bd9e162fc26f5ae45913d4df07ec" Mar 08 20:20:05 crc kubenswrapper[4885]: I0308 20:20:05.417306 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:05 crc kubenswrapper[4885]: I0308 20:20:05.840856 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550014-9hsx2"] Mar 08 20:20:05 crc kubenswrapper[4885]: I0308 20:20:05.851341 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550014-9hsx2"] Mar 08 20:20:07 crc kubenswrapper[4885]: I0308 20:20:07.385400 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fba06a47-da5c-44a3-9184-d2d92d14ce91" path="/var/lib/kubelet/pods/fba06a47-da5c-44a3-9184-d2d92d14ce91/volumes" Mar 08 20:20:32 crc kubenswrapper[4885]: I0308 20:20:32.818052 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:20:32 crc kubenswrapper[4885]: I0308 20:20:32.818846 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:21:01 crc kubenswrapper[4885]: I0308 20:21:01.107559 4885 scope.go:117] "RemoveContainer" containerID="c1bd976fbf85046e9452c23b99e63722a390fc224cc0805a726ba4a52e322c78" Mar 08 20:21:02 crc kubenswrapper[4885]: I0308 20:21:02.819031 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:21:02 crc kubenswrapper[4885]: 
I0308 20:21:02.819458 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:21:32 crc kubenswrapper[4885]: I0308 20:21:32.818802 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:21:32 crc kubenswrapper[4885]: I0308 20:21:32.819611 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:21:32 crc kubenswrapper[4885]: I0308 20:21:32.819671 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:21:32 crc kubenswrapper[4885]: I0308 20:21:32.820845 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:21:32 crc kubenswrapper[4885]: I0308 20:21:32.820993 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" 
containerName="machine-config-daemon" containerID="cri-o://91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929" gracePeriod=600 Mar 08 20:21:33 crc kubenswrapper[4885]: I0308 20:21:33.304358 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929" exitCode=0 Mar 08 20:21:33 crc kubenswrapper[4885]: I0308 20:21:33.304579 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929"} Mar 08 20:21:33 crc kubenswrapper[4885]: I0308 20:21:33.305067 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36"} Mar 08 20:21:33 crc kubenswrapper[4885]: I0308 20:21:33.305179 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.154464 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550022-cwspw"] Mar 08 20:22:00 crc kubenswrapper[4885]: E0308 20:22:00.155602 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98e098b-03c5-49cd-8009-4d1dde33cd6d" containerName="oc" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.155645 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98e098b-03c5-49cd-8009-4d1dde33cd6d" containerName="oc" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.155940 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98e098b-03c5-49cd-8009-4d1dde33cd6d" containerName="oc" Mar 08 20:22:00 
crc kubenswrapper[4885]: I0308 20:22:00.156450 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.162763 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.162869 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.163335 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.169980 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550022-cwspw"] Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.276000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmqp\" (UniqueName: \"kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp\") pod \"auto-csr-approver-29550022-cwspw\" (UID: \"9c1236a4-ac5f-4b66-8064-a0877ea3eb13\") " pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.377679 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmqp\" (UniqueName: \"kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp\") pod \"auto-csr-approver-29550022-cwspw\" (UID: \"9c1236a4-ac5f-4b66-8064-a0877ea3eb13\") " pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.407814 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmqp\" (UniqueName: \"kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp\") 
pod \"auto-csr-approver-29550022-cwspw\" (UID: \"9c1236a4-ac5f-4b66-8064-a0877ea3eb13\") " pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.519326 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.786285 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550022-cwspw"] Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.802542 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:22:01 crc kubenswrapper[4885]: I0308 20:22:01.575632 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550022-cwspw" event={"ID":"9c1236a4-ac5f-4b66-8064-a0877ea3eb13","Type":"ContainerStarted","Data":"ca296d5c26e34896d77d029616e76eb2c8c7bb8af0a56bf88bc8e7b3b5612378"} Mar 08 20:22:02 crc kubenswrapper[4885]: I0308 20:22:02.585883 4885 generic.go:334] "Generic (PLEG): container finished" podID="9c1236a4-ac5f-4b66-8064-a0877ea3eb13" containerID="02239c6874bdb24f3dcd1bad4a5e3559f6f779758a7821e5dae46f6c1d9294ea" exitCode=0 Mar 08 20:22:02 crc kubenswrapper[4885]: I0308 20:22:02.586006 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550022-cwspw" event={"ID":"9c1236a4-ac5f-4b66-8064-a0877ea3eb13","Type":"ContainerDied","Data":"02239c6874bdb24f3dcd1bad4a5e3559f6f779758a7821e5dae46f6c1d9294ea"} Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.038736 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.135175 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccmqp\" (UniqueName: \"kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp\") pod \"9c1236a4-ac5f-4b66-8064-a0877ea3eb13\" (UID: \"9c1236a4-ac5f-4b66-8064-a0877ea3eb13\") " Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.140881 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp" (OuterVolumeSpecName: "kube-api-access-ccmqp") pod "9c1236a4-ac5f-4b66-8064-a0877ea3eb13" (UID: "9c1236a4-ac5f-4b66-8064-a0877ea3eb13"). InnerVolumeSpecName "kube-api-access-ccmqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.236402 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccmqp\" (UniqueName: \"kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp\") on node \"crc\" DevicePath \"\"" Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.617911 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550022-cwspw" event={"ID":"9c1236a4-ac5f-4b66-8064-a0877ea3eb13","Type":"ContainerDied","Data":"ca296d5c26e34896d77d029616e76eb2c8c7bb8af0a56bf88bc8e7b3b5612378"} Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.618002 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca296d5c26e34896d77d029616e76eb2c8c7bb8af0a56bf88bc8e7b3b5612378" Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.618083 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:05 crc kubenswrapper[4885]: I0308 20:22:05.129133 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550016-dnrqv"] Mar 08 20:22:05 crc kubenswrapper[4885]: I0308 20:22:05.135647 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550016-dnrqv"] Mar 08 20:22:05 crc kubenswrapper[4885]: I0308 20:22:05.382403 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" path="/var/lib/kubelet/pods/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0/volumes" Mar 08 20:23:01 crc kubenswrapper[4885]: I0308 20:23:01.226580 4885 scope.go:117] "RemoveContainer" containerID="a43d61c4ab66d4467b0260a033d95a916347542edf45ef6f0bb611f886b4a5b2" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.160518 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550024-9zz2p"] Mar 08 20:24:00 crc kubenswrapper[4885]: E0308 20:24:00.161622 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1236a4-ac5f-4b66-8064-a0877ea3eb13" containerName="oc" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.161646 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1236a4-ac5f-4b66-8064-a0877ea3eb13" containerName="oc" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.161900 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1236a4-ac5f-4b66-8064-a0877ea3eb13" containerName="oc" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.162663 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.166330 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.166774 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.166683 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.179713 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550024-9zz2p"] Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.323482 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xctl5\" (UniqueName: \"kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5\") pod \"auto-csr-approver-29550024-9zz2p\" (UID: \"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104\") " pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.425184 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xctl5\" (UniqueName: \"kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5\") pod \"auto-csr-approver-29550024-9zz2p\" (UID: \"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104\") " pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.459618 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xctl5\" (UniqueName: \"kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5\") pod \"auto-csr-approver-29550024-9zz2p\" (UID: \"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104\") " 
pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.496226 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.803815 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550024-9zz2p"] Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.822399 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" event={"ID":"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104","Type":"ContainerStarted","Data":"ad70ce182b1fb55e4e39317e12a5c25b3c761fefbf247fb843baee1de63015d0"} Mar 08 20:24:02 crc kubenswrapper[4885]: I0308 20:24:02.819829 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:24:02 crc kubenswrapper[4885]: I0308 20:24:02.820327 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:24:02 crc kubenswrapper[4885]: I0308 20:24:02.844404 4885 generic.go:334] "Generic (PLEG): container finished" podID="1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" containerID="43f525f1dbdfe3e17033373036412190ee626493c56e86fcd87bd80de646fc57" exitCode=0 Mar 08 20:24:02 crc kubenswrapper[4885]: I0308 20:24:02.844488 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" 
event={"ID":"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104","Type":"ContainerDied","Data":"43f525f1dbdfe3e17033373036412190ee626493c56e86fcd87bd80de646fc57"} Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.240883 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.394719 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xctl5\" (UniqueName: \"kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5\") pod \"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104\" (UID: \"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104\") " Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.405225 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5" (OuterVolumeSpecName: "kube-api-access-xctl5") pod "1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" (UID: "1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104"). InnerVolumeSpecName "kube-api-access-xctl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.497229 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xctl5\" (UniqueName: \"kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5\") on node \"crc\" DevicePath \"\"" Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.865907 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" event={"ID":"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104","Type":"ContainerDied","Data":"ad70ce182b1fb55e4e39317e12a5c25b3c761fefbf247fb843baee1de63015d0"} Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.866291 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad70ce182b1fb55e4e39317e12a5c25b3c761fefbf247fb843baee1de63015d0" Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.866019 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:05 crc kubenswrapper[4885]: I0308 20:24:05.330965 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550018-kfvl8"] Mar 08 20:24:05 crc kubenswrapper[4885]: I0308 20:24:05.340223 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550018-kfvl8"] Mar 08 20:24:05 crc kubenswrapper[4885]: I0308 20:24:05.384646 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="059c0df3-29c7-4def-970d-ba52e1884b8f" path="/var/lib/kubelet/pods/059c0df3-29c7-4def-970d-ba52e1884b8f/volumes" Mar 08 20:24:32 crc kubenswrapper[4885]: I0308 20:24:32.818064 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 08 20:24:32 crc kubenswrapper[4885]: I0308 20:24:32.819690 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.227844 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:36 crc kubenswrapper[4885]: E0308 20:24:36.231143 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" containerName="oc" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.231178 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" containerName="oc" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.231595 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" containerName="oc" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.235645 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.236089 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.368572 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.368625 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.368739 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpzr4\" (UniqueName: \"kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.470480 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpzr4\" (UniqueName: \"kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.470568 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities\") pod 
\"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.470609 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.471051 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.471506 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.512720 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpzr4\" (UniqueName: \"kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.564734 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.049124 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.213151 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"] Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.216629 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.232447 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerStarted","Data":"d8abcb92c837d0e997506f5dff9884df30ccd03f8039a5bdc556e91073778235"} Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.259463 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"] Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.386905 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-695js\" (UniqueName: \"kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.387002 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.387045 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.488066 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-695js\" (UniqueName: \"kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.488120 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.488164 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.488769 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.488844 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.508148 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-695js\" (UniqueName: \"kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.552116 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.976291 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"] Mar 08 20:24:37 crc kubenswrapper[4885]: W0308 20:24:37.986606 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37ef9b83_5e44_4dd4_917b_0cc2b22994a7.slice/crio-a1d5a5740ffa83d735afd40a7eaa37bf1f2399163d49ebd9aa6df35eb7248970 WatchSource:0}: Error finding container a1d5a5740ffa83d735afd40a7eaa37bf1f2399163d49ebd9aa6df35eb7248970: Status 404 returned error can't find the container with id a1d5a5740ffa83d735afd40a7eaa37bf1f2399163d49ebd9aa6df35eb7248970 Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.242454 4885 generic.go:334] "Generic (PLEG): container finished" podID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerID="76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755" exitCode=0 Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.242561 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" 
event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerDied","Data":"76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755"} Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.242593 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerStarted","Data":"a1d5a5740ffa83d735afd40a7eaa37bf1f2399163d49ebd9aa6df35eb7248970"} Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.245820 4885 generic.go:334] "Generic (PLEG): container finished" podID="9dc24a73-d641-47de-9542-5898804547cf" containerID="e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d" exitCode=0 Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.245865 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerDied","Data":"e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d"} Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.608415 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5v28s"] Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.609771 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.626456 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5v28s"] Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.701867 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.701973 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b2r4\" (UniqueName: \"kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.702001 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.804063 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b2r4\" (UniqueName: \"kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.804140 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.804320 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.804728 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.804791 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.835357 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b2r4\" (UniqueName: \"kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.934099 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:39 crc kubenswrapper[4885]: I0308 20:24:39.254397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerStarted","Data":"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a"} Mar 08 20:24:39 crc kubenswrapper[4885]: I0308 20:24:39.260374 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerStarted","Data":"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d"} Mar 08 20:24:39 crc kubenswrapper[4885]: I0308 20:24:39.421640 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5v28s"] Mar 08 20:24:39 crc kubenswrapper[4885]: W0308 20:24:39.451627 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f86ef9b_5ca5_425b_aec8_17efc661afb8.slice/crio-c6c42b763278541056b77e1140376fc7d055c7c270f1b8a2f7313a7c8600e3fe WatchSource:0}: Error finding container c6c42b763278541056b77e1140376fc7d055c7c270f1b8a2f7313a7c8600e3fe: Status 404 returned error can't find the container with id c6c42b763278541056b77e1140376fc7d055c7c270f1b8a2f7313a7c8600e3fe Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.277740 4885 generic.go:334] "Generic (PLEG): container finished" podID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerID="8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b" exitCode=0 Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.277882 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" 
event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerDied","Data":"8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b"} Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.278421 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerStarted","Data":"c6c42b763278541056b77e1140376fc7d055c7c270f1b8a2f7313a7c8600e3fe"} Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.283066 4885 generic.go:334] "Generic (PLEG): container finished" podID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerID="cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d" exitCode=0 Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.283152 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerDied","Data":"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d"} Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.299326 4885 generic.go:334] "Generic (PLEG): container finished" podID="9dc24a73-d641-47de-9542-5898804547cf" containerID="cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a" exitCode=0 Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.299390 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerDied","Data":"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a"} Mar 08 20:24:41 crc kubenswrapper[4885]: I0308 20:24:41.312113 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerStarted","Data":"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef"} Mar 08 20:24:41 crc kubenswrapper[4885]: I0308 
20:24:41.315168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerStarted","Data":"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714"} Mar 08 20:24:41 crc kubenswrapper[4885]: I0308 20:24:41.318441 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerStarted","Data":"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8"} Mar 08 20:24:41 crc kubenswrapper[4885]: I0308 20:24:41.343099 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6zr2n" podStartSLOduration=2.886295057 podStartE2EDuration="5.343068231s" podCreationTimestamp="2026-03-08 20:24:36 +0000 UTC" firstStartedPulling="2026-03-08 20:24:38.247579665 +0000 UTC m=+3179.643633688" lastFinishedPulling="2026-03-08 20:24:40.704352799 +0000 UTC m=+3182.100406862" observedRunningTime="2026-03-08 20:24:41.338211571 +0000 UTC m=+3182.734265644" watchObservedRunningTime="2026-03-08 20:24:41.343068231 +0000 UTC m=+3182.739122294" Mar 08 20:24:41 crc kubenswrapper[4885]: I0308 20:24:41.367558 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5wgq8" podStartSLOduration=1.8996834219999998 podStartE2EDuration="4.367541962s" podCreationTimestamp="2026-03-08 20:24:37 +0000 UTC" firstStartedPulling="2026-03-08 20:24:38.24402052 +0000 UTC m=+3179.640074553" lastFinishedPulling="2026-03-08 20:24:40.71187904 +0000 UTC m=+3182.107933093" observedRunningTime="2026-03-08 20:24:41.364863561 +0000 UTC m=+3182.760917654" watchObservedRunningTime="2026-03-08 20:24:41.367541962 +0000 UTC m=+3182.763595985" Mar 08 20:24:42 crc kubenswrapper[4885]: I0308 20:24:42.336025 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerID="a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714" exitCode=0 Mar 08 20:24:42 crc kubenswrapper[4885]: I0308 20:24:42.336119 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerDied","Data":"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714"} Mar 08 20:24:43 crc kubenswrapper[4885]: I0308 20:24:43.343673 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerStarted","Data":"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0"} Mar 08 20:24:43 crc kubenswrapper[4885]: I0308 20:24:43.367993 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5v28s" podStartSLOduration=2.892734866 podStartE2EDuration="5.367975272s" podCreationTimestamp="2026-03-08 20:24:38 +0000 UTC" firstStartedPulling="2026-03-08 20:24:40.280545551 +0000 UTC m=+3181.676599584" lastFinishedPulling="2026-03-08 20:24:42.755785957 +0000 UTC m=+3184.151839990" observedRunningTime="2026-03-08 20:24:43.366068762 +0000 UTC m=+3184.762122795" watchObservedRunningTime="2026-03-08 20:24:43.367975272 +0000 UTC m=+3184.764029295" Mar 08 20:24:46 crc kubenswrapper[4885]: I0308 20:24:46.565613 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:46 crc kubenswrapper[4885]: I0308 20:24:46.565906 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:46 crc kubenswrapper[4885]: I0308 20:24:46.634007 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:47 
crc kubenswrapper[4885]: I0308 20:24:47.457053 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:47 crc kubenswrapper[4885]: I0308 20:24:47.552294 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:47 crc kubenswrapper[4885]: I0308 20:24:47.552698 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:48 crc kubenswrapper[4885]: I0308 20:24:48.626045 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5wgq8" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="registry-server" probeResult="failure" output=< Mar 08 20:24:48 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 20:24:48 crc kubenswrapper[4885]: > Mar 08 20:24:48 crc kubenswrapper[4885]: I0308 20:24:48.934689 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:48 crc kubenswrapper[4885]: I0308 20:24:48.934790 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:49 crc kubenswrapper[4885]: I0308 20:24:49.007702 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:49 crc kubenswrapper[4885]: I0308 20:24:49.205456 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:49 crc kubenswrapper[4885]: I0308 20:24:49.398588 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6zr2n" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="registry-server" 
containerID="cri-o://ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef" gracePeriod=2 Mar 08 20:24:49 crc kubenswrapper[4885]: I0308 20:24:49.484288 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:49 crc kubenswrapper[4885]: I0308 20:24:49.924357 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.086279 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpzr4\" (UniqueName: \"kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4\") pod \"9dc24a73-d641-47de-9542-5898804547cf\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.086387 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities\") pod \"9dc24a73-d641-47de-9542-5898804547cf\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.086429 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content\") pod \"9dc24a73-d641-47de-9542-5898804547cf\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.087608 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities" (OuterVolumeSpecName: "utilities") pod "9dc24a73-d641-47de-9542-5898804547cf" (UID: "9dc24a73-d641-47de-9542-5898804547cf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.095485 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4" (OuterVolumeSpecName: "kube-api-access-dpzr4") pod "9dc24a73-d641-47de-9542-5898804547cf" (UID: "9dc24a73-d641-47de-9542-5898804547cf"). InnerVolumeSpecName "kube-api-access-dpzr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.111097 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9dc24a73-d641-47de-9542-5898804547cf" (UID: "9dc24a73-d641-47de-9542-5898804547cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.188108 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.188677 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.188745 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpzr4\" (UniqueName: \"kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4\") on node \"crc\" DevicePath \"\"" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.414063 4885 generic.go:334] "Generic (PLEG): container finished" podID="9dc24a73-d641-47de-9542-5898804547cf" 
containerID="ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef" exitCode=0 Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.414158 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.414262 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerDied","Data":"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef"} Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.414405 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerDied","Data":"d8abcb92c837d0e997506f5dff9884df30ccd03f8039a5bdc556e91073778235"} Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.414445 4885 scope.go:117] "RemoveContainer" containerID="ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.444519 4885 scope.go:117] "RemoveContainer" containerID="cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.477134 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.488674 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.500257 4885 scope.go:117] "RemoveContainer" containerID="e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.525006 4885 scope.go:117] "RemoveContainer" containerID="ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef" Mar 08 
20:24:50 crc kubenswrapper[4885]: E0308 20:24:50.538344 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef\": container with ID starting with ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef not found: ID does not exist" containerID="ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef"
Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.538409 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef"} err="failed to get container status \"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef\": rpc error: code = NotFound desc = could not find container \"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef\": container with ID starting with ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef not found: ID does not exist"
Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.538449 4885 scope.go:117] "RemoveContainer" containerID="cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a"
Mar 08 20:24:50 crc kubenswrapper[4885]: E0308 20:24:50.539257 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a\": container with ID starting with cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a not found: ID does not exist" containerID="cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a"
Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.539320 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a"} err="failed to get container status \"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a\": rpc error: code = NotFound desc = could not find container \"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a\": container with ID starting with cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a not found: ID does not exist"
Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.539361 4885 scope.go:117] "RemoveContainer" containerID="e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d"
Mar 08 20:24:50 crc kubenswrapper[4885]: E0308 20:24:50.540247 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d\": container with ID starting with e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d not found: ID does not exist" containerID="e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d"
Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.540312 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d"} err="failed to get container status \"e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d\": rpc error: code = NotFound desc = could not find container \"e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d\": container with ID starting with e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d not found: ID does not exist"
Mar 08 20:24:51 crc kubenswrapper[4885]: I0308 20:24:51.383025 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc24a73-d641-47de-9542-5898804547cf" path="/var/lib/kubelet/pods/9dc24a73-d641-47de-9542-5898804547cf/volumes"
Mar 08 20:24:51 crc kubenswrapper[4885]: I0308 20:24:51.402640 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5v28s"]
Mar 08 20:24:51 crc kubenswrapper[4885]: I0308 20:24:51.425126 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5v28s" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="registry-server" containerID="cri-o://c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0" gracePeriod=2
Mar 08 20:24:51 crc kubenswrapper[4885]: I0308 20:24:51.919833 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5v28s"
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.026094 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content\") pod \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") "
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.026165 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b2r4\" (UniqueName: \"kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4\") pod \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") "
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.026260 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities\") pod \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") "
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.027945 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities" (OuterVolumeSpecName: "utilities") pod "0f86ef9b-5ca5-425b-aec8-17efc661afb8" (UID: "0f86ef9b-5ca5-425b-aec8-17efc661afb8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.035977 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4" (OuterVolumeSpecName: "kube-api-access-8b2r4") pod "0f86ef9b-5ca5-425b-aec8-17efc661afb8" (UID: "0f86ef9b-5ca5-425b-aec8-17efc661afb8"). InnerVolumeSpecName "kube-api-access-8b2r4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.108590 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f86ef9b-5ca5-425b-aec8-17efc661afb8" (UID: "0f86ef9b-5ca5-425b-aec8-17efc661afb8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.128272 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.128321 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b2r4\" (UniqueName: \"kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4\") on node \"crc\" DevicePath \"\""
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.128343 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.438600 4885 generic.go:334] "Generic (PLEG): container finished" podID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerID="c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0" exitCode=0
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.438665 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5v28s"
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.438675 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerDied","Data":"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0"}
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.438740 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerDied","Data":"c6c42b763278541056b77e1140376fc7d055c7c270f1b8a2f7313a7c8600e3fe"}
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.438782 4885 scope.go:117] "RemoveContainer" containerID="c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0"
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.468238 4885 scope.go:117] "RemoveContainer" containerID="a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714"
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.498098 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5v28s"]
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.507378 4885 scope.go:117] "RemoveContainer" containerID="8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b"
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.508468 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5v28s"]
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.542381 4885 scope.go:117] "RemoveContainer" containerID="c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0"
Mar 08 20:24:52 crc kubenswrapper[4885]: E0308 20:24:52.543042 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0\": container with ID starting with c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0 not found: ID does not exist" containerID="c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0"
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.543099 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0"} err="failed to get container status \"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0\": rpc error: code = NotFound desc = could not find container \"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0\": container with ID starting with c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0 not found: ID does not exist"
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.543141 4885 scope.go:117] "RemoveContainer" containerID="a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714"
Mar 08 20:24:52 crc kubenswrapper[4885]: E0308 20:24:52.543668 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714\": container with ID starting with a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714 not found: ID does not exist" containerID="a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714"
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.543718 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714"} err="failed to get container status \"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714\": rpc error: code = NotFound desc = could not find container \"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714\": container with ID starting with a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714 not found: ID does not exist"
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.543757 4885 scope.go:117] "RemoveContainer" containerID="8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b"
Mar 08 20:24:52 crc kubenswrapper[4885]: E0308 20:24:52.544445 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b\": container with ID starting with 8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b not found: ID does not exist" containerID="8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b"
Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.544538 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b"} err="failed to get container status \"8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b\": rpc error: code = NotFound desc = could not find container \"8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b\": container with ID starting with 8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b not found: ID does not exist"
Mar 08 20:24:53 crc kubenswrapper[4885]: I0308 20:24:53.381350 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" path="/var/lib/kubelet/pods/0f86ef9b-5ca5-425b-aec8-17efc661afb8/volumes"
Mar 08 20:24:57 crc kubenswrapper[4885]: I0308 20:24:57.627562 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5wgq8"
Mar 08 20:24:57 crc kubenswrapper[4885]: I0308 20:24:57.692701 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5wgq8"
Mar 08 20:24:57 crc kubenswrapper[4885]: I0308 20:24:57.912610 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"]
Mar 08 20:24:59 crc kubenswrapper[4885]: I0308 20:24:59.502500 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5wgq8" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="registry-server" containerID="cri-o://e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8" gracePeriod=2
Mar 08 20:24:59 crc kubenswrapper[4885]: I0308 20:24:59.972263 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5wgq8"
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.152625 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content\") pod \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") "
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.152692 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-695js\" (UniqueName: \"kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js\") pod \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") "
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.152759 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities\") pod \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") "
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.153973 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities" (OuterVolumeSpecName: "utilities") pod "37ef9b83-5e44-4dd4-917b-0cc2b22994a7" (UID: "37ef9b83-5e44-4dd4-917b-0cc2b22994a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.162798 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js" (OuterVolumeSpecName: "kube-api-access-695js") pod "37ef9b83-5e44-4dd4-917b-0cc2b22994a7" (UID: "37ef9b83-5e44-4dd4-917b-0cc2b22994a7"). InnerVolumeSpecName "kube-api-access-695js". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.267493 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-695js\" (UniqueName: \"kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js\") on node \"crc\" DevicePath \"\""
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.267557 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.328386 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37ef9b83-5e44-4dd4-917b-0cc2b22994a7" (UID: "37ef9b83-5e44-4dd4-917b-0cc2b22994a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.368689 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.515718 4885 generic.go:334] "Generic (PLEG): container finished" podID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerID="e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8" exitCode=0
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.515810 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5wgq8"
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.515816 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerDied","Data":"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8"}
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.517821 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerDied","Data":"a1d5a5740ffa83d735afd40a7eaa37bf1f2399163d49ebd9aa6df35eb7248970"}
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.517865 4885 scope.go:117] "RemoveContainer" containerID="e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8"
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.549302 4885 scope.go:117] "RemoveContainer" containerID="cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d"
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.571468 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"]
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.578648 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"]
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.596117 4885 scope.go:117] "RemoveContainer" containerID="76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755"
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.625050 4885 scope.go:117] "RemoveContainer" containerID="e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8"
Mar 08 20:25:00 crc kubenswrapper[4885]: E0308 20:25:00.625697 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8\": container with ID starting with e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8 not found: ID does not exist" containerID="e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8"
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.625766 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8"} err="failed to get container status \"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8\": rpc error: code = NotFound desc = could not find container \"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8\": container with ID starting with e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8 not found: ID does not exist"
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.625809 4885 scope.go:117] "RemoveContainer" containerID="cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d"
Mar 08 20:25:00 crc kubenswrapper[4885]: E0308 20:25:00.626457 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d\": container with ID starting with cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d not found: ID does not exist" containerID="cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d"
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.626512 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d"} err="failed to get container status \"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d\": rpc error: code = NotFound desc = could not find container \"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d\": container with ID starting with cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d not found: ID does not exist"
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.626550 4885 scope.go:117] "RemoveContainer" containerID="76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755"
Mar 08 20:25:00 crc kubenswrapper[4885]: E0308 20:25:00.626981 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755\": container with ID starting with 76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755 not found: ID does not exist" containerID="76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755"
Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.627022 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755"} err="failed to get container status \"76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755\": rpc error: code = NotFound desc = could not find container \"76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755\": container with ID starting with 76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755 not found: ID does not exist"
Mar 08 20:25:01 crc kubenswrapper[4885]: I0308 20:25:01.328914 4885 scope.go:117] "RemoveContainer" containerID="bf1fcbdcbc45401a37fee6b7776fac0a669dfcd972b2838d8063a648b4ecf370"
Mar 08 20:25:01 crc kubenswrapper[4885]: I0308 20:25:01.386150 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" path="/var/lib/kubelet/pods/37ef9b83-5e44-4dd4-917b-0cc2b22994a7/volumes"
Mar 08 20:25:02 crc kubenswrapper[4885]: I0308 20:25:02.818596 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 20:25:02 crc kubenswrapper[4885]: I0308 20:25:02.818997 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 20:25:02 crc kubenswrapper[4885]: I0308 20:25:02.819068 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 20:25:02 crc kubenswrapper[4885]: I0308 20:25:02.820008 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 20:25:02 crc kubenswrapper[4885]: I0308 20:25:02.820106 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" gracePeriod=600
Mar 08 20:25:02 crc kubenswrapper[4885]: E0308 20:25:02.957006 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:25:03 crc kubenswrapper[4885]: I0308 20:25:03.547125 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" exitCode=0
Mar 08 20:25:03 crc kubenswrapper[4885]: I0308 20:25:03.547186 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36"}
Mar 08 20:25:03 crc kubenswrapper[4885]: I0308 20:25:03.547230 4885 scope.go:117] "RemoveContainer" containerID="91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929"
Mar 08 20:25:03 crc kubenswrapper[4885]: I0308 20:25:03.548095 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36"
Mar 08 20:25:03 crc kubenswrapper[4885]: E0308 20:25:03.548576 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:25:18 crc kubenswrapper[4885]: I0308 20:25:18.369934 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36"
Mar 08 20:25:18 crc kubenswrapper[4885]: E0308 20:25:18.371148 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:25:29 crc kubenswrapper[4885]: I0308 20:25:29.376397 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36"
Mar 08 20:25:29 crc kubenswrapper[4885]: E0308 20:25:29.377097 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:25:41 crc kubenswrapper[4885]: I0308 20:25:41.369005 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36"
Mar 08 20:25:41 crc kubenswrapper[4885]: E0308 20:25:41.370190 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:25:56 crc kubenswrapper[4885]: I0308 20:25:56.368915 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36"
Mar 08 20:25:56 crc kubenswrapper[4885]: E0308 20:25:56.369951 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.165778 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550026-vw7xs"]
Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.166842 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="extract-content"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.166873 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="extract-content"
Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.166905 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="extract-content"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.166961 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="extract-content"
Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.166990 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="extract-utilities"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167006 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="extract-utilities"
Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.167031 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="extract-utilities"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167047 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="extract-utilities"
Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.167086 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="extract-utilities"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167103 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="extract-utilities"
Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.167120 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="registry-server"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167135 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="registry-server"
Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.167155 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="extract-content"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167171 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="extract-content"
Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.167204 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="registry-server"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167219 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="registry-server"
Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.168596 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="registry-server"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.168682 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="registry-server"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.169085 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="registry-server"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.169134 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="registry-server"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.169219 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="registry-server"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.170197 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550026-vw7xs"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.176640 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.176884 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.177287 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.181183 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550026-vw7xs"]
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.195830 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx\") pod \"auto-csr-approver-29550026-vw7xs\" (UID: \"99bd2190-0abe-4434-b2f6-3707852e2d43\") " pod="openshift-infra/auto-csr-approver-29550026-vw7xs"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.296871 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx\") pod \"auto-csr-approver-29550026-vw7xs\" (UID: \"99bd2190-0abe-4434-b2f6-3707852e2d43\") " pod="openshift-infra/auto-csr-approver-29550026-vw7xs"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.326741 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx\") pod \"auto-csr-approver-29550026-vw7xs\" (UID: \"99bd2190-0abe-4434-b2f6-3707852e2d43\") " pod="openshift-infra/auto-csr-approver-29550026-vw7xs"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.503256 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550026-vw7xs"
Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.803638 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550026-vw7xs"]
Mar 08 20:26:01 crc kubenswrapper[4885]: I0308 20:26:01.069716 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" event={"ID":"99bd2190-0abe-4434-b2f6-3707852e2d43","Type":"ContainerStarted","Data":"d3e33d2362c9e189e4fc6a457183a0cf126fa32fd3f165d2c89b58532cc1383a"}
Mar 08 20:26:02 crc kubenswrapper[4885]: I0308 20:26:02.081087 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" event={"ID":"99bd2190-0abe-4434-b2f6-3707852e2d43","Type":"ContainerStarted","Data":"d37e225980dc2c4d236b82dd40a6a7c22fa7155e659179661d10d8de1d9aabfd"}
Mar 08 20:26:02 crc kubenswrapper[4885]: I0308 20:26:02.106595 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" podStartSLOduration=1.294364064 podStartE2EDuration="2.106565446s" podCreationTimestamp="2026-03-08 20:26:00 +0000 UTC" firstStartedPulling="2026-03-08 20:26:00.810878497 +0000 UTC m=+3262.206932560" lastFinishedPulling="2026-03-08 20:26:01.623079879 +0000 UTC m=+3263.019133942" observedRunningTime="2026-03-08 20:26:02.099320303 +0000 UTC m=+3263.495374326" watchObservedRunningTime="2026-03-08 20:26:02.106565446 +0000 UTC m=+3263.502619519"
Mar 08 20:26:03 crc kubenswrapper[4885]: I0308 20:26:03.094609 4885 generic.go:334] "Generic (PLEG): container finished" podID="99bd2190-0abe-4434-b2f6-3707852e2d43" containerID="d37e225980dc2c4d236b82dd40a6a7c22fa7155e659179661d10d8de1d9aabfd" exitCode=0
Mar 08 20:26:03 crc kubenswrapper[4885]: I0308 20:26:03.094739 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" event={"ID":"99bd2190-0abe-4434-b2f6-3707852e2d43","Type":"ContainerDied","Data":"d37e225980dc2c4d236b82dd40a6a7c22fa7155e659179661d10d8de1d9aabfd"}
Mar 08 20:26:04 crc kubenswrapper[4885]: I0308 20:26:04.449354 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550026-vw7xs"
Mar 08 20:26:04 crc kubenswrapper[4885]: I0308 20:26:04.471355 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx\") pod \"99bd2190-0abe-4434-b2f6-3707852e2d43\" (UID: \"99bd2190-0abe-4434-b2f6-3707852e2d43\") "
Mar 08 20:26:04 crc kubenswrapper[4885]: I0308 20:26:04.478282 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx" (OuterVolumeSpecName: "kube-api-access-t2jnx") pod "99bd2190-0abe-4434-b2f6-3707852e2d43" (UID: "99bd2190-0abe-4434-b2f6-3707852e2d43"). InnerVolumeSpecName "kube-api-access-t2jnx".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:26:04 crc kubenswrapper[4885]: I0308 20:26:04.573249 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx\") on node \"crc\" DevicePath \"\"" Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.121725 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" event={"ID":"99bd2190-0abe-4434-b2f6-3707852e2d43","Type":"ContainerDied","Data":"d3e33d2362c9e189e4fc6a457183a0cf126fa32fd3f165d2c89b58532cc1383a"} Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.121767 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.121797 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3e33d2362c9e189e4fc6a457183a0cf126fa32fd3f165d2c89b58532cc1383a" Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.257901 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550020-vm6nl"] Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.267277 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550020-vm6nl"] Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.384290 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98e098b-03c5-49cd-8009-4d1dde33cd6d" path="/var/lib/kubelet/pods/c98e098b-03c5-49cd-8009-4d1dde33cd6d/volumes" Mar 08 20:26:11 crc kubenswrapper[4885]: I0308 20:26:11.368987 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:26:11 crc kubenswrapper[4885]: E0308 20:26:11.370243 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:26:26 crc kubenswrapper[4885]: I0308 20:26:26.368594 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:26:26 crc kubenswrapper[4885]: E0308 20:26:26.370552 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:26:39 crc kubenswrapper[4885]: I0308 20:26:39.375167 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:26:39 crc kubenswrapper[4885]: E0308 20:26:39.377545 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:26:52 crc kubenswrapper[4885]: I0308 20:26:52.369189 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:26:52 crc kubenswrapper[4885]: E0308 20:26:52.370268 4885 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:27:01 crc kubenswrapper[4885]: I0308 20:27:01.482801 4885 scope.go:117] "RemoveContainer" containerID="c67ebacbdb0212a697eaa79c423debddfd753cb26e765c85de268bbfc9ff9476" Mar 08 20:27:06 crc kubenswrapper[4885]: I0308 20:27:06.369247 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:27:06 crc kubenswrapper[4885]: E0308 20:27:06.370319 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:27:19 crc kubenswrapper[4885]: I0308 20:27:19.377123 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:27:19 crc kubenswrapper[4885]: E0308 20:27:19.378066 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:27:31 crc kubenswrapper[4885]: I0308 20:27:31.369028 4885 scope.go:117] 
"RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:27:31 crc kubenswrapper[4885]: E0308 20:27:31.370034 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:27:45 crc kubenswrapper[4885]: I0308 20:27:45.368563 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:27:45 crc kubenswrapper[4885]: E0308 20:27:45.371795 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:27:57 crc kubenswrapper[4885]: I0308 20:27:57.369068 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:27:57 crc kubenswrapper[4885]: E0308 20:27:57.370289 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.160607 
4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550028-gfnbr"] Mar 08 20:28:00 crc kubenswrapper[4885]: E0308 20:28:00.161125 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bd2190-0abe-4434-b2f6-3707852e2d43" containerName="oc" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.161154 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bd2190-0abe-4434-b2f6-3707852e2d43" containerName="oc" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.161484 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bd2190-0abe-4434-b2f6-3707852e2d43" containerName="oc" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.162422 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.165297 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.165356 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.165445 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.180036 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550028-gfnbr"] Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.293366 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldn5c\" (UniqueName: \"kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c\") pod \"auto-csr-approver-29550028-gfnbr\" (UID: \"67320c5c-bbf3-4828-8a46-effb28e4d9a1\") " 
pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.395622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldn5c\" (UniqueName: \"kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c\") pod \"auto-csr-approver-29550028-gfnbr\" (UID: \"67320c5c-bbf3-4828-8a46-effb28e4d9a1\") " pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.429962 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldn5c\" (UniqueName: \"kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c\") pod \"auto-csr-approver-29550028-gfnbr\" (UID: \"67320c5c-bbf3-4828-8a46-effb28e4d9a1\") " pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.498553 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:01 crc kubenswrapper[4885]: I0308 20:28:01.016265 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550028-gfnbr"] Mar 08 20:28:01 crc kubenswrapper[4885]: I0308 20:28:01.028321 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:28:01 crc kubenswrapper[4885]: I0308 20:28:01.162236 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" event={"ID":"67320c5c-bbf3-4828-8a46-effb28e4d9a1","Type":"ContainerStarted","Data":"6fe9f4df87611064b1b67d0129cce1c0d7902b26837437b2c3d2c39f543ffce7"} Mar 08 20:28:03 crc kubenswrapper[4885]: I0308 20:28:03.180903 4885 generic.go:334] "Generic (PLEG): container finished" podID="67320c5c-bbf3-4828-8a46-effb28e4d9a1" containerID="ef07ccde27b0d39793669ff65783adc2f054456be78c379a19e61fa8a99b06b3" exitCode=0 Mar 
08 20:28:03 crc kubenswrapper[4885]: I0308 20:28:03.181008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" event={"ID":"67320c5c-bbf3-4828-8a46-effb28e4d9a1","Type":"ContainerDied","Data":"ef07ccde27b0d39793669ff65783adc2f054456be78c379a19e61fa8a99b06b3"} Mar 08 20:28:04 crc kubenswrapper[4885]: I0308 20:28:04.555460 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:04 crc kubenswrapper[4885]: I0308 20:28:04.662459 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldn5c\" (UniqueName: \"kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c\") pod \"67320c5c-bbf3-4828-8a46-effb28e4d9a1\" (UID: \"67320c5c-bbf3-4828-8a46-effb28e4d9a1\") " Mar 08 20:28:04 crc kubenswrapper[4885]: I0308 20:28:04.680256 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c" (OuterVolumeSpecName: "kube-api-access-ldn5c") pod "67320c5c-bbf3-4828-8a46-effb28e4d9a1" (UID: "67320c5c-bbf3-4828-8a46-effb28e4d9a1"). InnerVolumeSpecName "kube-api-access-ldn5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:28:04 crc kubenswrapper[4885]: I0308 20:28:04.764414 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldn5c\" (UniqueName: \"kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c\") on node \"crc\" DevicePath \"\"" Mar 08 20:28:05 crc kubenswrapper[4885]: I0308 20:28:05.201777 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" event={"ID":"67320c5c-bbf3-4828-8a46-effb28e4d9a1","Type":"ContainerDied","Data":"6fe9f4df87611064b1b67d0129cce1c0d7902b26837437b2c3d2c39f543ffce7"} Mar 08 20:28:05 crc kubenswrapper[4885]: I0308 20:28:05.202170 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fe9f4df87611064b1b67d0129cce1c0d7902b26837437b2c3d2c39f543ffce7" Mar 08 20:28:05 crc kubenswrapper[4885]: I0308 20:28:05.201864 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:05 crc kubenswrapper[4885]: I0308 20:28:05.641794 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550022-cwspw"] Mar 08 20:28:05 crc kubenswrapper[4885]: I0308 20:28:05.651811 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550022-cwspw"] Mar 08 20:28:07 crc kubenswrapper[4885]: I0308 20:28:07.380857 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1236a4-ac5f-4b66-8064-a0877ea3eb13" path="/var/lib/kubelet/pods/9c1236a4-ac5f-4b66-8064-a0877ea3eb13/volumes" Mar 08 20:28:12 crc kubenswrapper[4885]: I0308 20:28:12.368094 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:28:12 crc kubenswrapper[4885]: E0308 20:28:12.368871 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:28:24 crc kubenswrapper[4885]: I0308 20:28:24.367891 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:28:24 crc kubenswrapper[4885]: E0308 20:28:24.368611 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:28:35 crc kubenswrapper[4885]: I0308 20:28:35.368669 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:28:35 crc kubenswrapper[4885]: E0308 20:28:35.369836 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.869510 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:37 crc kubenswrapper[4885]: E0308 20:28:37.870608 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="67320c5c-bbf3-4828-8a46-effb28e4d9a1" containerName="oc" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.870638 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="67320c5c-bbf3-4828-8a46-effb28e4d9a1" containerName="oc" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.871026 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="67320c5c-bbf3-4828-8a46-effb28e4d9a1" containerName="oc" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.880606 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.887475 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s44hc\" (UniqueName: \"kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.887562 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.887619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.913343 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.989242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s44hc\" (UniqueName: \"kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.989331 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.989390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.990069 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.990683 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " 
pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:38 crc kubenswrapper[4885]: I0308 20:28:38.014407 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s44hc\" (UniqueName: \"kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:38 crc kubenswrapper[4885]: I0308 20:28:38.203094 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:38 crc kubenswrapper[4885]: I0308 20:28:38.709353 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:39 crc kubenswrapper[4885]: I0308 20:28:39.506011 4885 generic.go:334] "Generic (PLEG): container finished" podID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerID="faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba" exitCode=0 Mar 08 20:28:39 crc kubenswrapper[4885]: I0308 20:28:39.506170 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerDied","Data":"faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba"} Mar 08 20:28:39 crc kubenswrapper[4885]: I0308 20:28:39.506465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerStarted","Data":"1866ccbe71f36db1e42b7cbb8c3b67be8a85a9ae35ab075668dd6b8f7edcac68"} Mar 08 20:28:40 crc kubenswrapper[4885]: I0308 20:28:40.518251 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" 
event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerStarted","Data":"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132"} Mar 08 20:28:41 crc kubenswrapper[4885]: I0308 20:28:41.531874 4885 generic.go:334] "Generic (PLEG): container finished" podID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerID="1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132" exitCode=0 Mar 08 20:28:41 crc kubenswrapper[4885]: I0308 20:28:41.532131 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerDied","Data":"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132"} Mar 08 20:28:42 crc kubenswrapper[4885]: I0308 20:28:42.542413 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerStarted","Data":"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c"} Mar 08 20:28:42 crc kubenswrapper[4885]: I0308 20:28:42.577209 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-82wff" podStartSLOduration=3.139257967 podStartE2EDuration="5.577192189s" podCreationTimestamp="2026-03-08 20:28:37 +0000 UTC" firstStartedPulling="2026-03-08 20:28:39.508749894 +0000 UTC m=+3420.904803947" lastFinishedPulling="2026-03-08 20:28:41.946684106 +0000 UTC m=+3423.342738169" observedRunningTime="2026-03-08 20:28:42.562821897 +0000 UTC m=+3423.958875930" watchObservedRunningTime="2026-03-08 20:28:42.577192189 +0000 UTC m=+3423.973246222" Mar 08 20:28:48 crc kubenswrapper[4885]: I0308 20:28:48.204334 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:48 crc kubenswrapper[4885]: I0308 20:28:48.204721 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:48 crc kubenswrapper[4885]: I0308 20:28:48.261307 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:48 crc kubenswrapper[4885]: I0308 20:28:48.671205 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:49 crc kubenswrapper[4885]: I0308 20:28:49.377810 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:28:49 crc kubenswrapper[4885]: E0308 20:28:49.378322 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:28:49 crc kubenswrapper[4885]: I0308 20:28:49.712074 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:50 crc kubenswrapper[4885]: I0308 20:28:50.619424 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-82wff" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="registry-server" containerID="cri-o://3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c" gracePeriod=2 Mar 08 20:28:50 crc kubenswrapper[4885]: E0308 20:28:50.836215 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21cc25eb_5e12_4a42_ae79_a8ed17b3a437.slice/crio-conmon-3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.172000 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.221072 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content\") pod \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.221381 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities\") pod \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.221508 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s44hc\" (UniqueName: \"kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc\") pod \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.223367 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities" (OuterVolumeSpecName: "utilities") pod "21cc25eb-5e12-4a42-ae79-a8ed17b3a437" (UID: "21cc25eb-5e12-4a42-ae79-a8ed17b3a437"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.229835 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc" (OuterVolumeSpecName: "kube-api-access-s44hc") pod "21cc25eb-5e12-4a42-ae79-a8ed17b3a437" (UID: "21cc25eb-5e12-4a42-ae79-a8ed17b3a437"). InnerVolumeSpecName "kube-api-access-s44hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.322527 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21cc25eb-5e12-4a42-ae79-a8ed17b3a437" (UID: "21cc25eb-5e12-4a42-ae79-a8ed17b3a437"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.323015 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s44hc\" (UniqueName: \"kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc\") on node \"crc\" DevicePath \"\"" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.323067 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.323087 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.631701 4885 generic.go:334] "Generic (PLEG): container finished" podID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" 
containerID="3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c" exitCode=0 Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.631765 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerDied","Data":"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c"} Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.631814 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerDied","Data":"1866ccbe71f36db1e42b7cbb8c3b67be8a85a9ae35ab075668dd6b8f7edcac68"} Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.631825 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.631843 4885 scope.go:117] "RemoveContainer" containerID="3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.668634 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.675593 4885 scope.go:117] "RemoveContainer" containerID="1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.681776 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.707356 4885 scope.go:117] "RemoveContainer" containerID="faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.758429 4885 scope.go:117] "RemoveContainer" containerID="3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c" Mar 08 
20:28:51 crc kubenswrapper[4885]: E0308 20:28:51.759994 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c\": container with ID starting with 3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c not found: ID does not exist" containerID="3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.760127 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c"} err="failed to get container status \"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c\": rpc error: code = NotFound desc = could not find container \"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c\": container with ID starting with 3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c not found: ID does not exist" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.760339 4885 scope.go:117] "RemoveContainer" containerID="1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132" Mar 08 20:28:51 crc kubenswrapper[4885]: E0308 20:28:51.760943 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132\": container with ID starting with 1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132 not found: ID does not exist" containerID="1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.760975 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132"} err="failed to get container status 
\"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132\": rpc error: code = NotFound desc = could not find container \"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132\": container with ID starting with 1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132 not found: ID does not exist" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.760994 4885 scope.go:117] "RemoveContainer" containerID="faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba" Mar 08 20:28:51 crc kubenswrapper[4885]: E0308 20:28:51.761460 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba\": container with ID starting with faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba not found: ID does not exist" containerID="faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.761512 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba"} err="failed to get container status \"faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba\": rpc error: code = NotFound desc = could not find container \"faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba\": container with ID starting with faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba not found: ID does not exist" Mar 08 20:28:53 crc kubenswrapper[4885]: I0308 20:28:53.386701 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" path="/var/lib/kubelet/pods/21cc25eb-5e12-4a42-ae79-a8ed17b3a437/volumes" Mar 08 20:29:01 crc kubenswrapper[4885]: I0308 20:29:01.368897 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 
20:29:01 crc kubenswrapper[4885]: E0308 20:29:01.370081 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:29:01 crc kubenswrapper[4885]: I0308 20:29:01.580924 4885 scope.go:117] "RemoveContainer" containerID="02239c6874bdb24f3dcd1bad4a5e3559f6f779758a7821e5dae46f6c1d9294ea" Mar 08 20:29:15 crc kubenswrapper[4885]: I0308 20:29:15.368065 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:29:15 crc kubenswrapper[4885]: E0308 20:29:15.369013 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:29:27 crc kubenswrapper[4885]: I0308 20:29:27.369239 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:29:27 crc kubenswrapper[4885]: E0308 20:29:27.369959 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:29:39 crc kubenswrapper[4885]: I0308 20:29:39.376304 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:29:39 crc kubenswrapper[4885]: E0308 20:29:39.376980 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:29:51 crc kubenswrapper[4885]: I0308 20:29:51.368451 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:29:51 crc kubenswrapper[4885]: E0308 20:29:51.369411 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.142677 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550030-trf4w"] Mar 08 20:30:00 crc kubenswrapper[4885]: E0308 20:30:00.143590 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="extract-content" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.143607 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="extract-content" Mar 08 20:30:00 crc kubenswrapper[4885]: 
E0308 20:30:00.143637 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="registry-server" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.143644 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="registry-server" Mar 08 20:30:00 crc kubenswrapper[4885]: E0308 20:30:00.143658 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="extract-utilities" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.143666 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="extract-utilities" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.143830 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="registry-server" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.144412 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.147855 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.148231 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.150532 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.158936 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b"] Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.160257 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.163248 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.167914 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.171306 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550030-trf4w"] Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.180417 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b"] Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.326486 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.326659 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.326735 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fttm9\" (UniqueName: 
\"kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.327056 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q96rd\" (UniqueName: \"kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd\") pod \"auto-csr-approver-29550030-trf4w\" (UID: \"b137f1c0-32d5-44b8-b0e3-a7ae07052e53\") " pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.429282 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q96rd\" (UniqueName: \"kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd\") pod \"auto-csr-approver-29550030-trf4w\" (UID: \"b137f1c0-32d5-44b8-b0e3-a7ae07052e53\") " pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.429401 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.429488 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 
20:30:00.429543 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fttm9\" (UniqueName: \"kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.431916 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.445598 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.451613 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fttm9\" (UniqueName: \"kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.467149 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q96rd\" (UniqueName: \"kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd\") pod \"auto-csr-approver-29550030-trf4w\" (UID: \"b137f1c0-32d5-44b8-b0e3-a7ae07052e53\") " 
pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.474325 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.486812 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.789743 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b"] Mar 08 20:30:00 crc kubenswrapper[4885]: W0308 20:30:00.999559 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb137f1c0_32d5_44b8_b0e3_a7ae07052e53.slice/crio-d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0 WatchSource:0}: Error finding container d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0: Status 404 returned error can't find the container with id d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0 Mar 08 20:30:01 crc kubenswrapper[4885]: I0308 20:30:01.000679 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550030-trf4w"] Mar 08 20:30:01 crc kubenswrapper[4885]: I0308 20:30:01.260114 4885 generic.go:334] "Generic (PLEG): container finished" podID="c873212d-4c8c-4d2c-ad89-be5ff96db764" containerID="ee3f4f74f30598ca5d1ebc5c4a12e553e2064229a545cf14384e548c26e071ad" exitCode=0 Mar 08 20:30:01 crc kubenswrapper[4885]: I0308 20:30:01.260754 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" event={"ID":"c873212d-4c8c-4d2c-ad89-be5ff96db764","Type":"ContainerDied","Data":"ee3f4f74f30598ca5d1ebc5c4a12e553e2064229a545cf14384e548c26e071ad"} Mar 08 20:30:01 crc 
kubenswrapper[4885]: I0308 20:30:01.260796 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" event={"ID":"c873212d-4c8c-4d2c-ad89-be5ff96db764","Type":"ContainerStarted","Data":"109a0dc3b12545c1b7867317e71c20cf01662f060e19c3067bb03d039c4e0570"} Mar 08 20:30:01 crc kubenswrapper[4885]: I0308 20:30:01.262206 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550030-trf4w" event={"ID":"b137f1c0-32d5-44b8-b0e3-a7ae07052e53","Type":"ContainerStarted","Data":"d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0"} Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.633785 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.684351 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fttm9\" (UniqueName: \"kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9\") pod \"c873212d-4c8c-4d2c-ad89-be5ff96db764\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.684482 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume\") pod \"c873212d-4c8c-4d2c-ad89-be5ff96db764\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.685075 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume\") pod \"c873212d-4c8c-4d2c-ad89-be5ff96db764\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 
20:30:02.685709 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume" (OuterVolumeSpecName: "config-volume") pod "c873212d-4c8c-4d2c-ad89-be5ff96db764" (UID: "c873212d-4c8c-4d2c-ad89-be5ff96db764"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.691762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c873212d-4c8c-4d2c-ad89-be5ff96db764" (UID: "c873212d-4c8c-4d2c-ad89-be5ff96db764"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.693036 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9" (OuterVolumeSpecName: "kube-api-access-fttm9") pod "c873212d-4c8c-4d2c-ad89-be5ff96db764" (UID: "c873212d-4c8c-4d2c-ad89-be5ff96db764"). InnerVolumeSpecName "kube-api-access-fttm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.785961 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.786000 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.786013 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fttm9\" (UniqueName: \"kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9\") on node \"crc\" DevicePath \"\"" Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.283852 4885 generic.go:334] "Generic (PLEG): container finished" podID="b137f1c0-32d5-44b8-b0e3-a7ae07052e53" containerID="f14e28da607b9cdf53f4fec9037b95180b5cd2506e58c13ecacb85cc348f41e4" exitCode=0 Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.284007 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550030-trf4w" event={"ID":"b137f1c0-32d5-44b8-b0e3-a7ae07052e53","Type":"ContainerDied","Data":"f14e28da607b9cdf53f4fec9037b95180b5cd2506e58c13ecacb85cc348f41e4"} Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.290979 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" event={"ID":"c873212d-4c8c-4d2c-ad89-be5ff96db764","Type":"ContainerDied","Data":"109a0dc3b12545c1b7867317e71c20cf01662f060e19c3067bb03d039c4e0570"} Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.291048 4885 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="109a0dc3b12545c1b7867317e71c20cf01662f060e19c3067bb03d039c4e0570" Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.291151 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.728569 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn"] Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.735958 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn"] Mar 08 20:30:04 crc kubenswrapper[4885]: I0308 20:30:04.628212 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:04 crc kubenswrapper[4885]: I0308 20:30:04.722986 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q96rd\" (UniqueName: \"kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd\") pod \"b137f1c0-32d5-44b8-b0e3-a7ae07052e53\" (UID: \"b137f1c0-32d5-44b8-b0e3-a7ae07052e53\") " Mar 08 20:30:04 crc kubenswrapper[4885]: I0308 20:30:04.732991 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd" (OuterVolumeSpecName: "kube-api-access-q96rd") pod "b137f1c0-32d5-44b8-b0e3-a7ae07052e53" (UID: "b137f1c0-32d5-44b8-b0e3-a7ae07052e53"). InnerVolumeSpecName "kube-api-access-q96rd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:30:04 crc kubenswrapper[4885]: I0308 20:30:04.824468 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q96rd\" (UniqueName: \"kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd\") on node \"crc\" DevicePath \"\"" Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.311060 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550030-trf4w" event={"ID":"b137f1c0-32d5-44b8-b0e3-a7ae07052e53","Type":"ContainerDied","Data":"d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0"} Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.311117 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0" Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.311171 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.368812 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.385136 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8673a65-b7c8-4c06-9713-a095b399358a" path="/var/lib/kubelet/pods/f8673a65-b7c8-4c06-9713-a095b399358a/volumes" Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.691380 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550024-9zz2p"] Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.701501 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550024-9zz2p"] Mar 08 20:30:06 crc kubenswrapper[4885]: I0308 20:30:06.321955 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796"} Mar 08 20:30:07 crc kubenswrapper[4885]: I0308 20:30:07.385584 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" path="/var/lib/kubelet/pods/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104/volumes" Mar 08 20:31:01 crc kubenswrapper[4885]: I0308 20:31:01.699197 4885 scope.go:117] "RemoveContainer" containerID="dc7b1fe292df06f58ac62305ed639526799d6857e418c3744dffefa96ddd2209" Mar 08 20:31:01 crc kubenswrapper[4885]: I0308 20:31:01.733810 4885 scope.go:117] "RemoveContainer" containerID="43f525f1dbdfe3e17033373036412190ee626493c56e86fcd87bd80de646fc57" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.158621 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550032-rvpv6"] Mar 08 20:32:00 crc kubenswrapper[4885]: E0308 20:32:00.159739 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c873212d-4c8c-4d2c-ad89-be5ff96db764" containerName="collect-profiles" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.159764 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c873212d-4c8c-4d2c-ad89-be5ff96db764" containerName="collect-profiles" Mar 08 20:32:00 crc kubenswrapper[4885]: E0308 20:32:00.159788 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b137f1c0-32d5-44b8-b0e3-a7ae07052e53" containerName="oc" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.159802 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b137f1c0-32d5-44b8-b0e3-a7ae07052e53" containerName="oc" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.160059 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c873212d-4c8c-4d2c-ad89-be5ff96db764" containerName="collect-profiles" Mar 08 20:32:00 crc 
kubenswrapper[4885]: I0308 20:32:00.160098 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b137f1c0-32d5-44b8-b0e3-a7ae07052e53" containerName="oc" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.160817 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.165073 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.165570 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.167283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.174210 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550032-rvpv6"] Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.361136 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flgv7\" (UniqueName: \"kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7\") pod \"auto-csr-approver-29550032-rvpv6\" (UID: \"d9ba305f-b091-419d-ba98-701437bceab1\") " pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.463389 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flgv7\" (UniqueName: \"kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7\") pod \"auto-csr-approver-29550032-rvpv6\" (UID: \"d9ba305f-b091-419d-ba98-701437bceab1\") " pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.495881 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flgv7\" (UniqueName: \"kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7\") pod \"auto-csr-approver-29550032-rvpv6\" (UID: \"d9ba305f-b091-419d-ba98-701437bceab1\") " pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.521941 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.774350 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550032-rvpv6"] Mar 08 20:32:01 crc kubenswrapper[4885]: I0308 20:32:01.419442 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" event={"ID":"d9ba305f-b091-419d-ba98-701437bceab1","Type":"ContainerStarted","Data":"ae05435fd41fbf09dd6c4b2d5283f6b690a30cd8c1b3ed75a554b2d6e3b3794b"} Mar 08 20:32:02 crc kubenswrapper[4885]: I0308 20:32:02.432195 4885 generic.go:334] "Generic (PLEG): container finished" podID="d9ba305f-b091-419d-ba98-701437bceab1" containerID="bdf74962d126ba2f43f277a948cdef5d47d9a79c0f03133bc1218e4128ca8e51" exitCode=0 Mar 08 20:32:02 crc kubenswrapper[4885]: I0308 20:32:02.432493 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" event={"ID":"d9ba305f-b091-419d-ba98-701437bceab1","Type":"ContainerDied","Data":"bdf74962d126ba2f43f277a948cdef5d47d9a79c0f03133bc1218e4128ca8e51"} Mar 08 20:32:03 crc kubenswrapper[4885]: I0308 20:32:03.767137 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:03 crc kubenswrapper[4885]: I0308 20:32:03.920636 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flgv7\" (UniqueName: \"kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7\") pod \"d9ba305f-b091-419d-ba98-701437bceab1\" (UID: \"d9ba305f-b091-419d-ba98-701437bceab1\") " Mar 08 20:32:03 crc kubenswrapper[4885]: I0308 20:32:03.928129 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7" (OuterVolumeSpecName: "kube-api-access-flgv7") pod "d9ba305f-b091-419d-ba98-701437bceab1" (UID: "d9ba305f-b091-419d-ba98-701437bceab1"). InnerVolumeSpecName "kube-api-access-flgv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.022870 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flgv7\" (UniqueName: \"kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7\") on node \"crc\" DevicePath \"\"" Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.451680 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" event={"ID":"d9ba305f-b091-419d-ba98-701437bceab1","Type":"ContainerDied","Data":"ae05435fd41fbf09dd6c4b2d5283f6b690a30cd8c1b3ed75a554b2d6e3b3794b"} Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.451788 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae05435fd41fbf09dd6c4b2d5283f6b690a30cd8c1b3ed75a554b2d6e3b3794b" Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.451832 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.860684 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550026-vw7xs"] Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.867552 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550026-vw7xs"] Mar 08 20:32:05 crc kubenswrapper[4885]: I0308 20:32:05.384891 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99bd2190-0abe-4434-b2f6-3707852e2d43" path="/var/lib/kubelet/pods/99bd2190-0abe-4434-b2f6-3707852e2d43/volumes" Mar 08 20:32:32 crc kubenswrapper[4885]: I0308 20:32:32.818713 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:32:32 crc kubenswrapper[4885]: I0308 20:32:32.819604 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:33:01 crc kubenswrapper[4885]: I0308 20:33:01.885716 4885 scope.go:117] "RemoveContainer" containerID="d37e225980dc2c4d236b82dd40a6a7c22fa7155e659179661d10d8de1d9aabfd" Mar 08 20:33:02 crc kubenswrapper[4885]: I0308 20:33:02.818381 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:33:02 crc kubenswrapper[4885]: 
I0308 20:33:02.818791 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:33:32 crc kubenswrapper[4885]: I0308 20:33:32.819128 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:33:32 crc kubenswrapper[4885]: I0308 20:33:32.819889 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:33:32 crc kubenswrapper[4885]: I0308 20:33:32.820006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:33:32 crc kubenswrapper[4885]: I0308 20:33:32.821458 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:33:32 crc kubenswrapper[4885]: I0308 20:33:32.821576 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" 
containerName="machine-config-daemon" containerID="cri-o://d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796" gracePeriod=600 Mar 08 20:33:33 crc kubenswrapper[4885]: I0308 20:33:33.283157 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796" exitCode=0 Mar 08 20:33:33 crc kubenswrapper[4885]: I0308 20:33:33.283205 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796"} Mar 08 20:33:33 crc kubenswrapper[4885]: I0308 20:33:33.283273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"} Mar 08 20:33:33 crc kubenswrapper[4885]: I0308 20:33:33.283301 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.158487 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550034-jl6bg"] Mar 08 20:34:00 crc kubenswrapper[4885]: E0308 20:34:00.159530 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ba305f-b091-419d-ba98-701437bceab1" containerName="oc" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.159554 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ba305f-b091-419d-ba98-701437bceab1" containerName="oc" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.159801 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ba305f-b091-419d-ba98-701437bceab1" containerName="oc" Mar 08 20:34:00 
crc kubenswrapper[4885]: I0308 20:34:00.160573 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.163626 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.164052 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.164443 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.177955 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550034-jl6bg"] Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.267204 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqxt2\" (UniqueName: \"kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2\") pod \"auto-csr-approver-29550034-jl6bg\" (UID: \"d02d14a7-00be-4808-9f97-ac3c16ae727a\") " pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.368857 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqxt2\" (UniqueName: \"kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2\") pod \"auto-csr-approver-29550034-jl6bg\" (UID: \"d02d14a7-00be-4808-9f97-ac3c16ae727a\") " pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.401275 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqxt2\" (UniqueName: \"kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2\") 
pod \"auto-csr-approver-29550034-jl6bg\" (UID: \"d02d14a7-00be-4808-9f97-ac3c16ae727a\") " pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.499300 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.779437 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550034-jl6bg"] Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.785642 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:34:01 crc kubenswrapper[4885]: I0308 20:34:01.558780 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" event={"ID":"d02d14a7-00be-4808-9f97-ac3c16ae727a","Type":"ContainerStarted","Data":"8735bc76cb7f7581185453813c78ba75fd8707fc04a1cec4e153ee10ab492db1"} Mar 08 20:34:02 crc kubenswrapper[4885]: I0308 20:34:02.571601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" event={"ID":"d02d14a7-00be-4808-9f97-ac3c16ae727a","Type":"ContainerStarted","Data":"4ed950f44e01488b5e84b2e6cb1b702242d1797af5d3e1be27eec2846e142c48"} Mar 08 20:34:02 crc kubenswrapper[4885]: I0308 20:34:02.592747 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" podStartSLOduration=1.276775999 podStartE2EDuration="2.592714429s" podCreationTimestamp="2026-03-08 20:34:00 +0000 UTC" firstStartedPulling="2026-03-08 20:34:00.785293325 +0000 UTC m=+3742.181347358" lastFinishedPulling="2026-03-08 20:34:02.101231735 +0000 UTC m=+3743.497285788" observedRunningTime="2026-03-08 20:34:02.589215116 +0000 UTC m=+3743.985269199" watchObservedRunningTime="2026-03-08 20:34:02.592714429 +0000 UTC m=+3743.988768492" Mar 08 20:34:03 crc 
kubenswrapper[4885]: I0308 20:34:03.582796 4885 generic.go:334] "Generic (PLEG): container finished" podID="d02d14a7-00be-4808-9f97-ac3c16ae727a" containerID="4ed950f44e01488b5e84b2e6cb1b702242d1797af5d3e1be27eec2846e142c48" exitCode=0 Mar 08 20:34:03 crc kubenswrapper[4885]: I0308 20:34:03.583006 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" event={"ID":"d02d14a7-00be-4808-9f97-ac3c16ae727a","Type":"ContainerDied","Data":"4ed950f44e01488b5e84b2e6cb1b702242d1797af5d3e1be27eec2846e142c48"} Mar 08 20:34:04 crc kubenswrapper[4885]: I0308 20:34:04.957666 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.051690 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqxt2\" (UniqueName: \"kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2\") pod \"d02d14a7-00be-4808-9f97-ac3c16ae727a\" (UID: \"d02d14a7-00be-4808-9f97-ac3c16ae727a\") " Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.060901 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2" (OuterVolumeSpecName: "kube-api-access-jqxt2") pod "d02d14a7-00be-4808-9f97-ac3c16ae727a" (UID: "d02d14a7-00be-4808-9f97-ac3c16ae727a"). InnerVolumeSpecName "kube-api-access-jqxt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.153996 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqxt2\" (UniqueName: \"kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2\") on node \"crc\" DevicePath \"\"" Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.604794 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" event={"ID":"d02d14a7-00be-4808-9f97-ac3c16ae727a","Type":"ContainerDied","Data":"8735bc76cb7f7581185453813c78ba75fd8707fc04a1cec4e153ee10ab492db1"} Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.605148 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8735bc76cb7f7581185453813c78ba75fd8707fc04a1cec4e153ee10ab492db1" Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.605021 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.688648 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550028-gfnbr"] Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.694969 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550028-gfnbr"] Mar 08 20:34:07 crc kubenswrapper[4885]: I0308 20:34:07.384861 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67320c5c-bbf3-4828-8a46-effb28e4d9a1" path="/var/lib/kubelet/pods/67320c5c-bbf3-4828-8a46-effb28e4d9a1/volumes" Mar 08 20:35:02 crc kubenswrapper[4885]: I0308 20:35:02.033481 4885 scope.go:117] "RemoveContainer" containerID="ef07ccde27b0d39793669ff65783adc2f054456be78c379a19e61fa8a99b06b3" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.485402 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:22 crc kubenswrapper[4885]: E0308 20:35:22.487234 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02d14a7-00be-4808-9f97-ac3c16ae727a" containerName="oc" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.487255 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02d14a7-00be-4808-9f97-ac3c16ae727a" containerName="oc" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.487459 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02d14a7-00be-4808-9f97-ac3c16ae727a" containerName="oc" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.489146 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.503751 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.506408 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p62bz\" (UniqueName: \"kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.506454 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.506477 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.607972 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p62bz\" (UniqueName: \"kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.608029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.608054 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.608623 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.608807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.642576 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p62bz\" (UniqueName: \"kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.856761 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:23 crc kubenswrapper[4885]: I0308 20:35:23.322430 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:23 crc kubenswrapper[4885]: I0308 20:35:23.379972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerStarted","Data":"95664d8f3885947db08492e1323e820bc6d9db6195795acb46f2951a840b2a88"} Mar 08 20:35:24 crc kubenswrapper[4885]: I0308 20:35:24.391094 4885 generic.go:334] "Generic (PLEG): container finished" podID="668c7890-77ad-445e-bee1-d40844c077ce" containerID="a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3" exitCode=0 Mar 08 20:35:24 crc kubenswrapper[4885]: I0308 20:35:24.391178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerDied","Data":"a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3"} Mar 08 20:35:24 crc kubenswrapper[4885]: I0308 20:35:24.859603 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:24 crc kubenswrapper[4885]: I0308 20:35:24.862519 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:24 crc kubenswrapper[4885]: I0308 20:35:24.879328 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.044638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.045004 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6v92\" (UniqueName: \"kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.045435 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.147071 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content\") pod \"certified-operators-76dq7\" (UID: 
\"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.147154 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6v92\" (UniqueName: \"kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.147201 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.147579 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.147850 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.174076 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6v92\" (UniqueName: \"kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92\") pod \"certified-operators-76dq7\" (UID: 
\"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.198052 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.403267 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerStarted","Data":"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741"} Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.498812 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:26 crc kubenswrapper[4885]: I0308 20:35:26.418030 4885 generic.go:334] "Generic (PLEG): container finished" podID="668c7890-77ad-445e-bee1-d40844c077ce" containerID="95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741" exitCode=0 Mar 08 20:35:26 crc kubenswrapper[4885]: I0308 20:35:26.418163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerDied","Data":"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741"} Mar 08 20:35:26 crc kubenswrapper[4885]: I0308 20:35:26.433227 4885 generic.go:334] "Generic (PLEG): container finished" podID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerID="27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe" exitCode=0 Mar 08 20:35:26 crc kubenswrapper[4885]: I0308 20:35:26.433378 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerDied","Data":"27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe"} Mar 08 20:35:26 crc kubenswrapper[4885]: I0308 20:35:26.433585 
4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerStarted","Data":"04c997db414c675653cb091d0ddce7408e517b59dd141334c3f4342fbea0c087"} Mar 08 20:35:27 crc kubenswrapper[4885]: I0308 20:35:27.441046 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerStarted","Data":"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188"} Mar 08 20:35:27 crc kubenswrapper[4885]: I0308 20:35:27.442605 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerStarted","Data":"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339"} Mar 08 20:35:27 crc kubenswrapper[4885]: I0308 20:35:27.481816 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kwm5v" podStartSLOduration=3.050130042 podStartE2EDuration="5.481801455s" podCreationTimestamp="2026-03-08 20:35:22 +0000 UTC" firstStartedPulling="2026-03-08 20:35:24.392977204 +0000 UTC m=+3825.789031267" lastFinishedPulling="2026-03-08 20:35:26.824648617 +0000 UTC m=+3828.220702680" observedRunningTime="2026-03-08 20:35:27.463323322 +0000 UTC m=+3828.859377355" watchObservedRunningTime="2026-03-08 20:35:27.481801455 +0000 UTC m=+3828.877855478" Mar 08 20:35:28 crc kubenswrapper[4885]: I0308 20:35:28.455346 4885 generic.go:334] "Generic (PLEG): container finished" podID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerID="236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339" exitCode=0 Mar 08 20:35:28 crc kubenswrapper[4885]: I0308 20:35:28.455512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" 
event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerDied","Data":"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339"} Mar 08 20:35:29 crc kubenswrapper[4885]: I0308 20:35:29.468123 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerStarted","Data":"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113"} Mar 08 20:35:29 crc kubenswrapper[4885]: I0308 20:35:29.502467 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-76dq7" podStartSLOduration=3.012944995 podStartE2EDuration="5.502444192s" podCreationTimestamp="2026-03-08 20:35:24 +0000 UTC" firstStartedPulling="2026-03-08 20:35:26.436950735 +0000 UTC m=+3827.833004788" lastFinishedPulling="2026-03-08 20:35:28.926449932 +0000 UTC m=+3830.322503985" observedRunningTime="2026-03-08 20:35:29.495887678 +0000 UTC m=+3830.891941731" watchObservedRunningTime="2026-03-08 20:35:29.502444192 +0000 UTC m=+3830.898498245" Mar 08 20:35:32 crc kubenswrapper[4885]: I0308 20:35:32.857031 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:32 crc kubenswrapper[4885]: I0308 20:35:32.858169 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:33 crc kubenswrapper[4885]: I0308 20:35:33.916704 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kwm5v" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="registry-server" probeResult="failure" output=< Mar 08 20:35:33 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 20:35:33 crc kubenswrapper[4885]: > Mar 08 20:35:35 crc kubenswrapper[4885]: I0308 20:35:35.199391 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:35 crc kubenswrapper[4885]: I0308 20:35:35.199483 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:35 crc kubenswrapper[4885]: I0308 20:35:35.271075 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:35 crc kubenswrapper[4885]: I0308 20:35:35.593595 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:35 crc kubenswrapper[4885]: I0308 20:35:35.658542 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:37 crc kubenswrapper[4885]: I0308 20:35:37.546854 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-76dq7" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="registry-server" containerID="cri-o://e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113" gracePeriod=2 Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.096124 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.265843 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities\") pod \"091a1ce3-4352-409e-aa25-b111c2b266f2\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.267490 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities" (OuterVolumeSpecName: "utilities") pod "091a1ce3-4352-409e-aa25-b111c2b266f2" (UID: "091a1ce3-4352-409e-aa25-b111c2b266f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.267862 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content\") pod \"091a1ce3-4352-409e-aa25-b111c2b266f2\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.268003 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6v92\" (UniqueName: \"kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92\") pod \"091a1ce3-4352-409e-aa25-b111c2b266f2\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.268762 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.276048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92" (OuterVolumeSpecName: "kube-api-access-r6v92") pod "091a1ce3-4352-409e-aa25-b111c2b266f2" (UID: "091a1ce3-4352-409e-aa25-b111c2b266f2"). InnerVolumeSpecName "kube-api-access-r6v92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.370821 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6v92\" (UniqueName: \"kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.510482 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "091a1ce3-4352-409e-aa25-b111c2b266f2" (UID: "091a1ce3-4352-409e-aa25-b111c2b266f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.563504 4885 generic.go:334] "Generic (PLEG): container finished" podID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerID="e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113" exitCode=0 Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.563561 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerDied","Data":"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113"} Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.563618 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.563644 4885 scope.go:117] "RemoveContainer" containerID="e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.563624 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerDied","Data":"04c997db414c675653cb091d0ddce7408e517b59dd141334c3f4342fbea0c087"} Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.575630 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.597243 4885 scope.go:117] "RemoveContainer" containerID="236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.620281 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.631584 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.640534 4885 scope.go:117] "RemoveContainer" containerID="27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.684390 4885 scope.go:117] "RemoveContainer" containerID="e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113" Mar 08 20:35:38 crc kubenswrapper[4885]: E0308 20:35:38.684986 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113\": 
container with ID starting with e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113 not found: ID does not exist" containerID="e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.685039 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113"} err="failed to get container status \"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113\": rpc error: code = NotFound desc = could not find container \"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113\": container with ID starting with e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113 not found: ID does not exist" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.685071 4885 scope.go:117] "RemoveContainer" containerID="236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339" Mar 08 20:35:38 crc kubenswrapper[4885]: E0308 20:35:38.685527 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339\": container with ID starting with 236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339 not found: ID does not exist" containerID="236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.685566 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339"} err="failed to get container status \"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339\": rpc error: code = NotFound desc = could not find container \"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339\": container with ID starting with 
236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339 not found: ID does not exist" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.685622 4885 scope.go:117] "RemoveContainer" containerID="27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe" Mar 08 20:35:38 crc kubenswrapper[4885]: E0308 20:35:38.686199 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe\": container with ID starting with 27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe not found: ID does not exist" containerID="27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.686259 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe"} err="failed to get container status \"27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe\": rpc error: code = NotFound desc = could not find container \"27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe\": container with ID starting with 27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe not found: ID does not exist" Mar 08 20:35:39 crc kubenswrapper[4885]: I0308 20:35:39.391739 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" path="/var/lib/kubelet/pods/091a1ce3-4352-409e-aa25-b111c2b266f2/volumes" Mar 08 20:35:42 crc kubenswrapper[4885]: I0308 20:35:42.932599 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:43 crc kubenswrapper[4885]: I0308 20:35:43.011434 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:43 crc kubenswrapper[4885]: 
I0308 20:35:43.180387 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:44 crc kubenswrapper[4885]: I0308 20:35:44.623366 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kwm5v" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="registry-server" containerID="cri-o://0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188" gracePeriod=2 Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.095185 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.186480 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities\") pod \"668c7890-77ad-445e-bee1-d40844c077ce\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.186602 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content\") pod \"668c7890-77ad-445e-bee1-d40844c077ce\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.186712 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p62bz\" (UniqueName: \"kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz\") pod \"668c7890-77ad-445e-bee1-d40844c077ce\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.188779 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities" (OuterVolumeSpecName: 
"utilities") pod "668c7890-77ad-445e-bee1-d40844c077ce" (UID: "668c7890-77ad-445e-bee1-d40844c077ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.192777 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz" (OuterVolumeSpecName: "kube-api-access-p62bz") pod "668c7890-77ad-445e-bee1-d40844c077ce" (UID: "668c7890-77ad-445e-bee1-d40844c077ce"). InnerVolumeSpecName "kube-api-access-p62bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.288729 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.288794 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p62bz\" (UniqueName: \"kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.416596 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "668c7890-77ad-445e-bee1-d40844c077ce" (UID: "668c7890-77ad-445e-bee1-d40844c077ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.491800 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.635069 4885 generic.go:334] "Generic (PLEG): container finished" podID="668c7890-77ad-445e-bee1-d40844c077ce" containerID="0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188" exitCode=0 Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.635150 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerDied","Data":"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188"} Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.635176 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.635182 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerDied","Data":"95664d8f3885947db08492e1323e820bc6d9db6195795acb46f2951a840b2a88"} Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.635229 4885 scope.go:117] "RemoveContainer" containerID="0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.674795 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.682082 4885 scope.go:117] "RemoveContainer" containerID="95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.685534 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.712307 4885 scope.go:117] "RemoveContainer" containerID="a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.747747 4885 scope.go:117] "RemoveContainer" containerID="0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.748396 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188\": container with ID starting with 0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188 not found: ID does not exist" containerID="0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.748467 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188"} err="failed to get container status \"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188\": rpc error: code = NotFound desc = could not find container \"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188\": container with ID starting with 0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188 not found: ID does not exist" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.748512 4885 scope.go:117] "RemoveContainer" containerID="95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.749077 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741\": container with ID starting with 95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741 not found: ID does not exist" containerID="95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.749162 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741"} err="failed to get container status \"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741\": rpc error: code = NotFound desc = could not find container \"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741\": container with ID starting with 95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741 not found: ID does not exist" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.749201 4885 scope.go:117] "RemoveContainer" containerID="a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 
20:35:45.749700 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3\": container with ID starting with a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3 not found: ID does not exist" containerID="a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.749735 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3"} err="failed to get container status \"a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3\": rpc error: code = NotFound desc = could not find container \"a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3\": container with ID starting with a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3 not found: ID does not exist" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.993307 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"] Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.994539 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="registry-server" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994612 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="registry-server" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.994664 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="extract-utilities" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994683 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="extract-utilities" Mar 08 20:35:45 crc 
kubenswrapper[4885]: E0308 20:35:45.994699 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="extract-content" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994714 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="extract-content" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.994740 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="registry-server" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994758 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="registry-server" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.994799 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="extract-content" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994815 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="extract-content" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.994854 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="extract-utilities" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994871 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="extract-utilities" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.995396 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="registry-server" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.995442 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="registry-server" Mar 08 20:35:45 crc 
kubenswrapper[4885]: I0308 20:35:45.997410 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.008240 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"] Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.102095 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.102157 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.102205 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p24zl\" (UniqueName: \"kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.203658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc 
kubenswrapper[4885]: I0308 20:35:46.203730 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.203782 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p24zl\" (UniqueName: \"kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.204302 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.204534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.230142 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p24zl\" (UniqueName: \"kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 
20:35:46.332706 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdb6"
Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.671541 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"]
Mar 08 20:35:47 crc kubenswrapper[4885]: I0308 20:35:47.379816 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="668c7890-77ad-445e-bee1-d40844c077ce" path="/var/lib/kubelet/pods/668c7890-77ad-445e-bee1-d40844c077ce/volumes"
Mar 08 20:35:47 crc kubenswrapper[4885]: I0308 20:35:47.656999 4885 generic.go:334] "Generic (PLEG): container finished" podID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerID="3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96" exitCode=0
Mar 08 20:35:47 crc kubenswrapper[4885]: I0308 20:35:47.657070 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerDied","Data":"3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96"}
Mar 08 20:35:47 crc kubenswrapper[4885]: I0308 20:35:47.657113 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerStarted","Data":"c5ee7465965ec77635d8950124463e0afc190160350b13cc77ce43e82b2b54fd"}
Mar 08 20:35:48 crc kubenswrapper[4885]: I0308 20:35:48.674807 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerStarted","Data":"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f"}
Mar 08 20:35:49 crc kubenswrapper[4885]: I0308 20:35:49.686819 4885 generic.go:334] "Generic (PLEG): container finished" podID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerID="e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f" exitCode=0
Mar 08 20:35:49 crc kubenswrapper[4885]: I0308 20:35:49.686981 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerDied","Data":"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f"}
Mar 08 20:35:50 crc kubenswrapper[4885]: I0308 20:35:50.698747 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerStarted","Data":"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8"}
Mar 08 20:35:50 crc kubenswrapper[4885]: I0308 20:35:50.727632 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6pdb6" podStartSLOduration=3.285112199 podStartE2EDuration="5.727607851s" podCreationTimestamp="2026-03-08 20:35:45 +0000 UTC" firstStartedPulling="2026-03-08 20:35:47.660036788 +0000 UTC m=+3849.056090841" lastFinishedPulling="2026-03-08 20:35:50.10253243 +0000 UTC m=+3851.498586493" observedRunningTime="2026-03-08 20:35:50.724846178 +0000 UTC m=+3852.120900241" watchObservedRunningTime="2026-03-08 20:35:50.727607851 +0000 UTC m=+3852.123661904"
Mar 08 20:35:56 crc kubenswrapper[4885]: I0308 20:35:56.332982 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6pdb6"
Mar 08 20:35:56 crc kubenswrapper[4885]: I0308 20:35:56.333608 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6pdb6"
Mar 08 20:35:56 crc kubenswrapper[4885]: I0308 20:35:56.410141 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6pdb6"
Mar 08 20:35:56 crc kubenswrapper[4885]: I0308 20:35:56.826423 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6pdb6"
Mar 08 20:35:56 crc kubenswrapper[4885]: I0308 20:35:56.883068 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"]
Mar 08 20:35:58 crc kubenswrapper[4885]: I0308 20:35:58.777391 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6pdb6" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="registry-server" containerID="cri-o://ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8" gracePeriod=2
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.318594 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdb6"
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.439590 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities\") pod \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") "
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.439813 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p24zl\" (UniqueName: \"kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl\") pod \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") "
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.439870 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content\") pod \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") "
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.441311 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities" (OuterVolumeSpecName: "utilities") pod "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" (UID: "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.450176 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl" (OuterVolumeSpecName: "kube-api-access-p24zl") pod "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" (UID: "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb"). InnerVolumeSpecName "kube-api-access-p24zl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.493142 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" (UID: "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.541803 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p24zl\" (UniqueName: \"kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl\") on node \"crc\" DevicePath \"\""
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.542023 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.542232 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.790985 4885 generic.go:334] "Generic (PLEG): container finished" podID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerID="ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8" exitCode=0
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.791059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerDied","Data":"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8"}
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.791147 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerDied","Data":"c5ee7465965ec77635d8950124463e0afc190160350b13cc77ce43e82b2b54fd"}
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.791185 4885 scope.go:117] "RemoveContainer" containerID="ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8"
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.791188 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdb6"
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.828640 4885 scope.go:117] "RemoveContainer" containerID="e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f"
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.855404 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"]
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.867888 4885 scope.go:117] "RemoveContainer" containerID="3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96"
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.870201 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"]
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.905801 4885 scope.go:117] "RemoveContainer" containerID="ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8"
Mar 08 20:35:59 crc kubenswrapper[4885]: E0308 20:35:59.906689 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8\": container with ID starting with ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8 not found: ID does not exist" containerID="ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8"
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.907034 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8"} err="failed to get container status \"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8\": rpc error: code = NotFound desc = could not find container \"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8\": container with ID starting with ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8 not found: ID does not exist"
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.907910 4885 scope.go:117] "RemoveContainer" containerID="e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f"
Mar 08 20:35:59 crc kubenswrapper[4885]: E0308 20:35:59.908824 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f\": container with ID starting with e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f not found: ID does not exist" containerID="e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f"
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.908890 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f"} err="failed to get container status \"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f\": rpc error: code = NotFound desc = could not find container \"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f\": container with ID starting with e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f not found: ID does not exist"
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.908965 4885 scope.go:117] "RemoveContainer" containerID="3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96"
Mar 08 20:35:59 crc kubenswrapper[4885]: E0308 20:35:59.909462 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96\": container with ID starting with 3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96 not found: ID does not exist" containerID="3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96"
Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.909515 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96"} err="failed to get container status \"3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96\": rpc error: code = NotFound desc = could not find container \"3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96\": container with ID starting with 3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96 not found: ID does not exist"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.157750 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550036-mmj2c"]
Mar 08 20:36:00 crc kubenswrapper[4885]: E0308 20:36:00.158305 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="registry-server"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.158318 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="registry-server"
Mar 08 20:36:00 crc kubenswrapper[4885]: E0308 20:36:00.158340 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="extract-content"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.158348 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="extract-content"
Mar 08 20:36:00 crc kubenswrapper[4885]: E0308 20:36:00.158361 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="extract-utilities"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.158367 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="extract-utilities"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.158512 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="registry-server"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.158928 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550036-mmj2c"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.163355 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.163576 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.164053 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.226346 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550036-mmj2c"]
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.253805 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggcgc\" (UniqueName: \"kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc\") pod \"auto-csr-approver-29550036-mmj2c\" (UID: \"4ba33559-23ce-4dec-a0fb-d479e47d6f1c\") " pod="openshift-infra/auto-csr-approver-29550036-mmj2c"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.355034 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcgc\" (UniqueName: \"kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc\") pod \"auto-csr-approver-29550036-mmj2c\" (UID: \"4ba33559-23ce-4dec-a0fb-d479e47d6f1c\") " pod="openshift-infra/auto-csr-approver-29550036-mmj2c"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.379454 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggcgc\" (UniqueName: \"kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc\") pod \"auto-csr-approver-29550036-mmj2c\" (UID: \"4ba33559-23ce-4dec-a0fb-d479e47d6f1c\") " pod="openshift-infra/auto-csr-approver-29550036-mmj2c"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.532510 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550036-mmj2c"
Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.812810 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550036-mmj2c"]
Mar 08 20:36:01 crc kubenswrapper[4885]: I0308 20:36:01.383476 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" path="/var/lib/kubelet/pods/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb/volumes"
Mar 08 20:36:01 crc kubenswrapper[4885]: I0308 20:36:01.821665 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550036-mmj2c" event={"ID":"4ba33559-23ce-4dec-a0fb-d479e47d6f1c","Type":"ContainerStarted","Data":"c0b3dbd75ade62cc9e4c59bf23c43bb523fd418630f7cc03d789576b7a9de4d5"}
Mar 08 20:36:02 crc kubenswrapper[4885]: I0308 20:36:02.818735 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 20:36:02 crc kubenswrapper[4885]: I0308 20:36:02.819156 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 20:36:02 crc kubenswrapper[4885]: I0308 20:36:02.834761 4885 generic.go:334] "Generic (PLEG): container finished" podID="4ba33559-23ce-4dec-a0fb-d479e47d6f1c" containerID="4655c509628eeb67ead0ef189a244c640e2a8d6513d351ea58b54fb12caa9de4" exitCode=0
Mar 08 20:36:02 crc kubenswrapper[4885]: I0308 20:36:02.834845 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550036-mmj2c" event={"ID":"4ba33559-23ce-4dec-a0fb-d479e47d6f1c","Type":"ContainerDied","Data":"4655c509628eeb67ead0ef189a244c640e2a8d6513d351ea58b54fb12caa9de4"}
Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.269135 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550036-mmj2c"
Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.417730 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggcgc\" (UniqueName: \"kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc\") pod \"4ba33559-23ce-4dec-a0fb-d479e47d6f1c\" (UID: \"4ba33559-23ce-4dec-a0fb-d479e47d6f1c\") "
Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.426267 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc" (OuterVolumeSpecName: "kube-api-access-ggcgc") pod "4ba33559-23ce-4dec-a0fb-d479e47d6f1c" (UID: "4ba33559-23ce-4dec-a0fb-d479e47d6f1c"). InnerVolumeSpecName "kube-api-access-ggcgc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.524387 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggcgc\" (UniqueName: \"kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc\") on node \"crc\" DevicePath \"\""
Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.856717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550036-mmj2c" event={"ID":"4ba33559-23ce-4dec-a0fb-d479e47d6f1c","Type":"ContainerDied","Data":"c0b3dbd75ade62cc9e4c59bf23c43bb523fd418630f7cc03d789576b7a9de4d5"}
Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.856777 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0b3dbd75ade62cc9e4c59bf23c43bb523fd418630f7cc03d789576b7a9de4d5"
Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.856795 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550036-mmj2c"
Mar 08 20:36:05 crc kubenswrapper[4885]: I0308 20:36:05.356201 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550030-trf4w"]
Mar 08 20:36:05 crc kubenswrapper[4885]: I0308 20:36:05.384561 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550030-trf4w"]
Mar 08 20:36:07 crc kubenswrapper[4885]: I0308 20:36:07.386196 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b137f1c0-32d5-44b8-b0e3-a7ae07052e53" path="/var/lib/kubelet/pods/b137f1c0-32d5-44b8-b0e3-a7ae07052e53/volumes"
Mar 08 20:36:32 crc kubenswrapper[4885]: I0308 20:36:32.818805 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 20:36:32 crc kubenswrapper[4885]: I0308 20:36:32.819813 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.232593 4885 scope.go:117] "RemoveContainer" containerID="f14e28da607b9cdf53f4fec9037b95180b5cd2506e58c13ecacb85cc348f41e4"
Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.818741 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.819007 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.819208 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.821098 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.821273 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" gracePeriod=600
Mar 08 20:37:03 crc kubenswrapper[4885]: E0308 20:37:03.020609 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:37:03 crc kubenswrapper[4885]: I0308 20:37:03.442016 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" exitCode=0
Mar 08 20:37:03 crc kubenswrapper[4885]: I0308 20:37:03.442077 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"}
Mar 08 20:37:03 crc kubenswrapper[4885]: I0308 20:37:03.442126 4885 scope.go:117] "RemoveContainer" containerID="d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796"
Mar 08 20:37:03 crc kubenswrapper[4885]: I0308 20:37:03.444234 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:37:03 crc kubenswrapper[4885]: E0308 20:37:03.444817 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:37:18 crc kubenswrapper[4885]: I0308 20:37:18.368603 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:37:18 crc kubenswrapper[4885]: E0308 20:37:18.369322 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:37:31 crc kubenswrapper[4885]: I0308 20:37:31.370080 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:37:31 crc kubenswrapper[4885]: E0308 20:37:31.374974 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:37:43 crc kubenswrapper[4885]: I0308 20:37:43.369027 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:37:43 crc kubenswrapper[4885]: E0308 20:37:43.369894 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:37:55 crc kubenswrapper[4885]: I0308 20:37:55.368383 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:37:55 crc kubenswrapper[4885]: E0308 20:37:55.369446 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.154155 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550038-r82w9"]
Mar 08 20:38:00 crc kubenswrapper[4885]: E0308 20:38:00.154885 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba33559-23ce-4dec-a0fb-d479e47d6f1c" containerName="oc"
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.154904 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba33559-23ce-4dec-a0fb-d479e47d6f1c" containerName="oc"
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.155197 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba33559-23ce-4dec-a0fb-d479e47d6f1c" containerName="oc"
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.156137 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550038-r82w9"
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.158865 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.159289 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.159348 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.174097 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550038-r82w9"]
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.331035 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swfkf\" (UniqueName: \"kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf\") pod \"auto-csr-approver-29550038-r82w9\" (UID: \"5d4b8072-1e19-4a26-b038-2a3c6d634760\") " pod="openshift-infra/auto-csr-approver-29550038-r82w9"
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.432577 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swfkf\" (UniqueName: \"kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf\") pod \"auto-csr-approver-29550038-r82w9\" (UID: \"5d4b8072-1e19-4a26-b038-2a3c6d634760\") " pod="openshift-infra/auto-csr-approver-29550038-r82w9"
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.467118 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swfkf\" (UniqueName: \"kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf\") pod \"auto-csr-approver-29550038-r82w9\" (UID: \"5d4b8072-1e19-4a26-b038-2a3c6d634760\") " pod="openshift-infra/auto-csr-approver-29550038-r82w9"
Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.481069 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550038-r82w9"
Mar 08 20:38:01 crc kubenswrapper[4885]: I0308 20:38:01.003386 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550038-r82w9"]
Mar 08 20:38:01 crc kubenswrapper[4885]: I0308 20:38:01.945518 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550038-r82w9" event={"ID":"5d4b8072-1e19-4a26-b038-2a3c6d634760","Type":"ContainerStarted","Data":"9aec129bb802cca7c4bc295b3ce196351757ed4ddafbf8261cbec14420051973"}
Mar 08 20:38:02 crc kubenswrapper[4885]: I0308 20:38:02.954201 4885 generic.go:334] "Generic (PLEG): container finished" podID="5d4b8072-1e19-4a26-b038-2a3c6d634760" containerID="dce1374ad69b07a0e467431de66db375a54313b87f63c7ea37e12c9eb571e627" exitCode=0
Mar 08 20:38:02 crc kubenswrapper[4885]: I0308 20:38:02.954263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550038-r82w9" event={"ID":"5d4b8072-1e19-4a26-b038-2a3c6d634760","Type":"ContainerDied","Data":"dce1374ad69b07a0e467431de66db375a54313b87f63c7ea37e12c9eb571e627"}
Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.411673 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550038-r82w9"
Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.602836 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swfkf\" (UniqueName: \"kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf\") pod \"5d4b8072-1e19-4a26-b038-2a3c6d634760\" (UID: \"5d4b8072-1e19-4a26-b038-2a3c6d634760\") "
Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.611367 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf" (OuterVolumeSpecName: "kube-api-access-swfkf") pod "5d4b8072-1e19-4a26-b038-2a3c6d634760" (UID: "5d4b8072-1e19-4a26-b038-2a3c6d634760"). InnerVolumeSpecName "kube-api-access-swfkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.704965 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swfkf\" (UniqueName: \"kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf\") on node \"crc\" DevicePath \"\""
Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.972983 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550038-r82w9" event={"ID":"5d4b8072-1e19-4a26-b038-2a3c6d634760","Type":"ContainerDied","Data":"9aec129bb802cca7c4bc295b3ce196351757ed4ddafbf8261cbec14420051973"}
Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.973038 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aec129bb802cca7c4bc295b3ce196351757ed4ddafbf8261cbec14420051973"
Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.973119 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550038-r82w9"
Mar 08 20:38:05 crc kubenswrapper[4885]: I0308 20:38:05.508734 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550032-rvpv6"]
Mar 08 20:38:05 crc kubenswrapper[4885]: I0308 20:38:05.519180 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550032-rvpv6"]
Mar 08 20:38:07 crc kubenswrapper[4885]: I0308 20:38:07.385206 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ba305f-b091-419d-ba98-701437bceab1" path="/var/lib/kubelet/pods/d9ba305f-b091-419d-ba98-701437bceab1/volumes"
Mar 08 20:38:09 crc kubenswrapper[4885]: I0308 20:38:09.378744 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:38:09 crc kubenswrapper[4885]: E0308 20:38:09.379627 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:38:21 crc kubenswrapper[4885]: I0308 20:38:21.368256 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:38:21 crc kubenswrapper[4885]: E0308 20:38:21.369036 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:38:36 crc kubenswrapper[4885]: I0308 20:38:36.368043 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:38:36 crc kubenswrapper[4885]: E0308 20:38:36.369832 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.468476 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hd4x5"]
Mar 08 20:38:41 crc kubenswrapper[4885]: E0308 20:38:41.469356 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4b8072-1e19-4a26-b038-2a3c6d634760" containerName="oc"
Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.469372 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4b8072-1e19-4a26-b038-2a3c6d634760" containerName="oc"
Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.469531 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4b8072-1e19-4a26-b038-2a3c6d634760" containerName="oc"
Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.470723 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hd4x5"
Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.494770 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hd4x5"]
Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.639304 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkr6r\" (UniqueName: \"kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5"
Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.639416 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5"
Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.639456 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5"
Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.741026 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5"
Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.741124 4885 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-rkr6r\" (UniqueName: \"kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.741173 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.741671 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.741694 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.765279 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkr6r\" (UniqueName: \"kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.810062 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:42 crc kubenswrapper[4885]: I0308 20:38:42.313435 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hd4x5"] Mar 08 20:38:42 crc kubenswrapper[4885]: W0308 20:38:42.319375 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d8f1f01_8c81_4823_8ef7_85a09c6b6363.slice/crio-37dc900768c50f23591c69e25613183edcd7a4d2ecb14b71b92b4ce3c88eeb04 WatchSource:0}: Error finding container 37dc900768c50f23591c69e25613183edcd7a4d2ecb14b71b92b4ce3c88eeb04: Status 404 returned error can't find the container with id 37dc900768c50f23591c69e25613183edcd7a4d2ecb14b71b92b4ce3c88eeb04 Mar 08 20:38:42 crc kubenswrapper[4885]: I0308 20:38:42.534297 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerStarted","Data":"37dc900768c50f23591c69e25613183edcd7a4d2ecb14b71b92b4ce3c88eeb04"} Mar 08 20:38:43 crc kubenswrapper[4885]: I0308 20:38:43.551496 4885 generic.go:334] "Generic (PLEG): container finished" podID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerID="e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff" exitCode=0 Mar 08 20:38:43 crc kubenswrapper[4885]: I0308 20:38:43.551575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerDied","Data":"e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff"} Mar 08 20:38:44 crc kubenswrapper[4885]: I0308 20:38:44.560057 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" 
event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerStarted","Data":"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd"} Mar 08 20:38:45 crc kubenswrapper[4885]: I0308 20:38:45.571110 4885 generic.go:334] "Generic (PLEG): container finished" podID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerID="2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd" exitCode=0 Mar 08 20:38:45 crc kubenswrapper[4885]: I0308 20:38:45.571163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerDied","Data":"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd"} Mar 08 20:38:46 crc kubenswrapper[4885]: I0308 20:38:46.582486 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerStarted","Data":"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c"} Mar 08 20:38:46 crc kubenswrapper[4885]: I0308 20:38:46.605580 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hd4x5" podStartSLOduration=2.979132731 podStartE2EDuration="5.605557525s" podCreationTimestamp="2026-03-08 20:38:41 +0000 UTC" firstStartedPulling="2026-03-08 20:38:43.554839521 +0000 UTC m=+4024.950893554" lastFinishedPulling="2026-03-08 20:38:46.181264315 +0000 UTC m=+4027.577318348" observedRunningTime="2026-03-08 20:38:46.603370796 +0000 UTC m=+4027.999424819" watchObservedRunningTime="2026-03-08 20:38:46.605557525 +0000 UTC m=+4028.001611588" Mar 08 20:38:50 crc kubenswrapper[4885]: I0308 20:38:50.368991 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:38:50 crc kubenswrapper[4885]: E0308 20:38:50.369532 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:38:51 crc kubenswrapper[4885]: I0308 20:38:51.810406 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:51 crc kubenswrapper[4885]: I0308 20:38:51.810504 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:51 crc kubenswrapper[4885]: I0308 20:38:51.888044 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:52 crc kubenswrapper[4885]: I0308 20:38:52.706771 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:52 crc kubenswrapper[4885]: I0308 20:38:52.786550 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hd4x5"] Mar 08 20:38:54 crc kubenswrapper[4885]: I0308 20:38:54.649671 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hd4x5" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="registry-server" containerID="cri-o://16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c" gracePeriod=2 Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.650759 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.657998 4885 generic.go:334] "Generic (PLEG): container finished" podID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerID="16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c" exitCode=0 Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.658044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerDied","Data":"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c"} Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.658086 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.658106 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerDied","Data":"37dc900768c50f23591c69e25613183edcd7a4d2ecb14b71b92b4ce3c88eeb04"} Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.658129 4885 scope.go:117] "RemoveContainer" containerID="16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.664679 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkr6r\" (UniqueName: \"kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r\") pod \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.664750 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities\") pod 
\"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.664801 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content\") pod \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.668103 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities" (OuterVolumeSpecName: "utilities") pod "8d8f1f01-8c81-4823-8ef7-85a09c6b6363" (UID: "8d8f1f01-8c81-4823-8ef7-85a09c6b6363"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.672009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r" (OuterVolumeSpecName: "kube-api-access-rkr6r") pod "8d8f1f01-8c81-4823-8ef7-85a09c6b6363" (UID: "8d8f1f01-8c81-4823-8ef7-85a09c6b6363"). InnerVolumeSpecName "kube-api-access-rkr6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.702178 4885 scope.go:117] "RemoveContainer" containerID="2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.718758 4885 scope.go:117] "RemoveContainer" containerID="e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.742866 4885 scope.go:117] "RemoveContainer" containerID="16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c" Mar 08 20:38:55 crc kubenswrapper[4885]: E0308 20:38:55.743298 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c\": container with ID starting with 16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c not found: ID does not exist" containerID="16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.743327 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c"} err="failed to get container status \"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c\": rpc error: code = NotFound desc = could not find container \"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c\": container with ID starting with 16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c not found: ID does not exist" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.743346 4885 scope.go:117] "RemoveContainer" containerID="2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd" Mar 08 20:38:55 crc kubenswrapper[4885]: E0308 20:38:55.743657 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd\": container with ID starting with 2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd not found: ID does not exist" containerID="2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.743683 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd"} err="failed to get container status \"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd\": rpc error: code = NotFound desc = could not find container \"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd\": container with ID starting with 2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd not found: ID does not exist" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.743696 4885 scope.go:117] "RemoveContainer" containerID="e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.743718 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d8f1f01-8c81-4823-8ef7-85a09c6b6363" (UID: "8d8f1f01-8c81-4823-8ef7-85a09c6b6363"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:38:55 crc kubenswrapper[4885]: E0308 20:38:55.744230 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff\": container with ID starting with e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff not found: ID does not exist" containerID="e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.744276 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff"} err="failed to get container status \"e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff\": rpc error: code = NotFound desc = could not find container \"e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff\": container with ID starting with e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff not found: ID does not exist" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.765890 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.765943 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkr6r\" (UniqueName: \"kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r\") on node \"crc\" DevicePath \"\"" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.765958 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:38:56 crc kubenswrapper[4885]: I0308 20:38:56.010051 4885 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hd4x5"] Mar 08 20:38:56 crc kubenswrapper[4885]: I0308 20:38:56.016792 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hd4x5"] Mar 08 20:38:57 crc kubenswrapper[4885]: I0308 20:38:57.381053 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" path="/var/lib/kubelet/pods/8d8f1f01-8c81-4823-8ef7-85a09c6b6363/volumes" Mar 08 20:39:02 crc kubenswrapper[4885]: I0308 20:39:02.356611 4885 scope.go:117] "RemoveContainer" containerID="bdf74962d126ba2f43f277a948cdef5d47d9a79c0f03133bc1218e4128ca8e51" Mar 08 20:39:02 crc kubenswrapper[4885]: I0308 20:39:02.368832 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:39:02 crc kubenswrapper[4885]: E0308 20:39:02.369459 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:39:16 crc kubenswrapper[4885]: I0308 20:39:16.369244 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:39:16 crc kubenswrapper[4885]: E0308 20:39:16.370291 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:39:29 crc kubenswrapper[4885]: I0308 20:39:29.374868 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:39:29 crc kubenswrapper[4885]: E0308 20:39:29.375834 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:39:44 crc kubenswrapper[4885]: I0308 20:39:44.368776 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:39:44 crc kubenswrapper[4885]: E0308 20:39:44.370746 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:39:58 crc kubenswrapper[4885]: I0308 20:39:58.369214 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:39:58 crc kubenswrapper[4885]: E0308 20:39:58.372330 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.147919 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550040-n7d55"] Mar 08 20:40:00 crc kubenswrapper[4885]: E0308 20:40:00.148190 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="extract-content" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.148202 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="extract-content" Mar 08 20:40:00 crc kubenswrapper[4885]: E0308 20:40:00.148213 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="extract-utilities" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.148219 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="extract-utilities" Mar 08 20:40:00 crc kubenswrapper[4885]: E0308 20:40:00.148244 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="registry-server" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.148251 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="registry-server" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.148367 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="registry-server" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.148768 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.153453 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.163538 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.163786 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.167672 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550040-n7d55"] Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.308551 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vttcs\" (UniqueName: \"kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs\") pod \"auto-csr-approver-29550040-n7d55\" (UID: \"f06a86c2-3f71-42f0-8f33-558ffba8e527\") " pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.410065 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vttcs\" (UniqueName: \"kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs\") pod \"auto-csr-approver-29550040-n7d55\" (UID: \"f06a86c2-3f71-42f0-8f33-558ffba8e527\") " pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.430228 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vttcs\" (UniqueName: \"kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs\") pod \"auto-csr-approver-29550040-n7d55\" (UID: \"f06a86c2-3f71-42f0-8f33-558ffba8e527\") " 
pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.481767 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.816705 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550040-n7d55"] Mar 08 20:40:01 crc kubenswrapper[4885]: I0308 20:40:01.173125 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:40:01 crc kubenswrapper[4885]: I0308 20:40:01.294004 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550040-n7d55" event={"ID":"f06a86c2-3f71-42f0-8f33-558ffba8e527","Type":"ContainerStarted","Data":"6a5ae7dc153802abf9af020cc9f0aebaff2d895024e66d13bba8161078742691"} Mar 08 20:40:04 crc kubenswrapper[4885]: I0308 20:40:04.324597 4885 generic.go:334] "Generic (PLEG): container finished" podID="f06a86c2-3f71-42f0-8f33-558ffba8e527" containerID="cd2399263f696f150807aac189b8392d7375833b5910349cae536de7fbfd333a" exitCode=0 Mar 08 20:40:04 crc kubenswrapper[4885]: I0308 20:40:04.324669 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550040-n7d55" event={"ID":"f06a86c2-3f71-42f0-8f33-558ffba8e527","Type":"ContainerDied","Data":"cd2399263f696f150807aac189b8392d7375833b5910349cae536de7fbfd333a"} Mar 08 20:40:05 crc kubenswrapper[4885]: I0308 20:40:05.744009 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550040-n7d55"
Mar 08 20:40:05 crc kubenswrapper[4885]: I0308 20:40:05.795724 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vttcs\" (UniqueName: \"kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs\") pod \"f06a86c2-3f71-42f0-8f33-558ffba8e527\" (UID: \"f06a86c2-3f71-42f0-8f33-558ffba8e527\") "
Mar 08 20:40:05 crc kubenswrapper[4885]: I0308 20:40:05.803870 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs" (OuterVolumeSpecName: "kube-api-access-vttcs") pod "f06a86c2-3f71-42f0-8f33-558ffba8e527" (UID: "f06a86c2-3f71-42f0-8f33-558ffba8e527"). InnerVolumeSpecName "kube-api-access-vttcs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:40:05 crc kubenswrapper[4885]: I0308 20:40:05.897400 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vttcs\" (UniqueName: \"kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs\") on node \"crc\" DevicePath \"\""
Mar 08 20:40:06 crc kubenswrapper[4885]: I0308 20:40:06.345443 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550040-n7d55" event={"ID":"f06a86c2-3f71-42f0-8f33-558ffba8e527","Type":"ContainerDied","Data":"6a5ae7dc153802abf9af020cc9f0aebaff2d895024e66d13bba8161078742691"}
Mar 08 20:40:06 crc kubenswrapper[4885]: I0308 20:40:06.345833 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a5ae7dc153802abf9af020cc9f0aebaff2d895024e66d13bba8161078742691"
Mar 08 20:40:06 crc kubenswrapper[4885]: I0308 20:40:06.346066 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550040-n7d55"
Mar 08 20:40:06 crc kubenswrapper[4885]: I0308 20:40:06.837829 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550034-jl6bg"]
Mar 08 20:40:06 crc kubenswrapper[4885]: I0308 20:40:06.845248 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550034-jl6bg"]
Mar 08 20:40:07 crc kubenswrapper[4885]: I0308 20:40:07.385692 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02d14a7-00be-4808-9f97-ac3c16ae727a" path="/var/lib/kubelet/pods/d02d14a7-00be-4808-9f97-ac3c16ae727a/volumes"
Mar 08 20:40:13 crc kubenswrapper[4885]: I0308 20:40:13.368204 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:40:13 crc kubenswrapper[4885]: E0308 20:40:13.368810 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:40:24 crc kubenswrapper[4885]: I0308 20:40:24.368728 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:40:24 crc kubenswrapper[4885]: E0308 20:40:24.369822 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:40:36 crc kubenswrapper[4885]: I0308 20:40:36.369074 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:40:36 crc kubenswrapper[4885]: E0308 20:40:36.369981 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:40:51 crc kubenswrapper[4885]: I0308 20:40:51.369452 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:40:51 crc kubenswrapper[4885]: E0308 20:40:51.370694 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:41:02 crc kubenswrapper[4885]: I0308 20:41:02.477756 4885 scope.go:117] "RemoveContainer" containerID="4ed950f44e01488b5e84b2e6cb1b702242d1797af5d3e1be27eec2846e142c48"
Mar 08 20:41:03 crc kubenswrapper[4885]: I0308 20:41:03.369358 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:41:03 crc kubenswrapper[4885]: E0308 20:41:03.369811 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:41:15 crc kubenswrapper[4885]: I0308 20:41:15.377753 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:41:15 crc kubenswrapper[4885]: E0308 20:41:15.378687 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:41:29 crc kubenswrapper[4885]: I0308 20:41:29.371671 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:41:29 crc kubenswrapper[4885]: E0308 20:41:29.372402 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:41:40 crc kubenswrapper[4885]: I0308 20:41:40.368689 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:41:40 crc kubenswrapper[4885]: E0308 20:41:40.369915 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:41:52 crc kubenswrapper[4885]: I0308 20:41:52.368419 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:41:52 crc kubenswrapper[4885]: E0308 20:41:52.369640 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.162662 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550042-d4xpj"]
Mar 08 20:42:00 crc kubenswrapper[4885]: E0308 20:42:00.163789 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f06a86c2-3f71-42f0-8f33-558ffba8e527" containerName="oc"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.163815 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f06a86c2-3f71-42f0-8f33-558ffba8e527" containerName="oc"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.164143 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f06a86c2-3f71-42f0-8f33-558ffba8e527" containerName="oc"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.164843 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550042-d4xpj"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.168241 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.168876 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.168881 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.181896 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jqgt\" (UniqueName: \"kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt\") pod \"auto-csr-approver-29550042-d4xpj\" (UID: \"34acfccb-db62-40e1-b46c-3227ce6e32ab\") " pod="openshift-infra/auto-csr-approver-29550042-d4xpj"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.186579 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550042-d4xpj"]
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.284214 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jqgt\" (UniqueName: \"kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt\") pod \"auto-csr-approver-29550042-d4xpj\" (UID: \"34acfccb-db62-40e1-b46c-3227ce6e32ab\") " pod="openshift-infra/auto-csr-approver-29550042-d4xpj"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.312108 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jqgt\" (UniqueName: \"kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt\") pod \"auto-csr-approver-29550042-d4xpj\" (UID: \"34acfccb-db62-40e1-b46c-3227ce6e32ab\") " pod="openshift-infra/auto-csr-approver-29550042-d4xpj"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.503569 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550042-d4xpj"
Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.769230 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550042-d4xpj"]
Mar 08 20:42:01 crc kubenswrapper[4885]: I0308 20:42:01.417091 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" event={"ID":"34acfccb-db62-40e1-b46c-3227ce6e32ab","Type":"ContainerStarted","Data":"d01447b3c7e3221c3dee3e2c801f3a57b4a606171c98022c9edfe738108226cb"}
Mar 08 20:42:02 crc kubenswrapper[4885]: I0308 20:42:02.426473 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" event={"ID":"34acfccb-db62-40e1-b46c-3227ce6e32ab","Type":"ContainerStarted","Data":"f37032ea52adfa6ddae3677872b806d5bfa71e165ea8806d1e82b81025d0feb8"}
Mar 08 20:42:02 crc kubenswrapper[4885]: I0308 20:42:02.447804 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" podStartSLOduration=1.418156971 podStartE2EDuration="2.447786514s" podCreationTimestamp="2026-03-08 20:42:00 +0000 UTC" firstStartedPulling="2026-03-08 20:42:00.881442401 +0000 UTC m=+4222.277496484" lastFinishedPulling="2026-03-08 20:42:01.911071994 +0000 UTC m=+4223.307126027" observedRunningTime="2026-03-08 20:42:02.44170563 +0000 UTC m=+4223.837759683" watchObservedRunningTime="2026-03-08 20:42:02.447786514 +0000 UTC m=+4223.843840547"
Mar 08 20:42:03 crc kubenswrapper[4885]: I0308 20:42:03.437402 4885 generic.go:334] "Generic (PLEG): container finished" podID="34acfccb-db62-40e1-b46c-3227ce6e32ab" containerID="f37032ea52adfa6ddae3677872b806d5bfa71e165ea8806d1e82b81025d0feb8" exitCode=0
Mar 08 20:42:03 crc kubenswrapper[4885]: I0308 20:42:03.437542 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" event={"ID":"34acfccb-db62-40e1-b46c-3227ce6e32ab","Type":"ContainerDied","Data":"f37032ea52adfa6ddae3677872b806d5bfa71e165ea8806d1e82b81025d0feb8"}
Mar 08 20:42:04 crc kubenswrapper[4885]: I0308 20:42:04.797237 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550042-d4xpj"
Mar 08 20:42:04 crc kubenswrapper[4885]: I0308 20:42:04.857242 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jqgt\" (UniqueName: \"kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt\") pod \"34acfccb-db62-40e1-b46c-3227ce6e32ab\" (UID: \"34acfccb-db62-40e1-b46c-3227ce6e32ab\") "
Mar 08 20:42:04 crc kubenswrapper[4885]: I0308 20:42:04.864146 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt" (OuterVolumeSpecName: "kube-api-access-7jqgt") pod "34acfccb-db62-40e1-b46c-3227ce6e32ab" (UID: "34acfccb-db62-40e1-b46c-3227ce6e32ab"). InnerVolumeSpecName "kube-api-access-7jqgt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:42:04 crc kubenswrapper[4885]: I0308 20:42:04.960038 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jqgt\" (UniqueName: \"kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt\") on node \"crc\" DevicePath \"\""
Mar 08 20:42:05 crc kubenswrapper[4885]: I0308 20:42:05.456736 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" event={"ID":"34acfccb-db62-40e1-b46c-3227ce6e32ab","Type":"ContainerDied","Data":"d01447b3c7e3221c3dee3e2c801f3a57b4a606171c98022c9edfe738108226cb"}
Mar 08 20:42:05 crc kubenswrapper[4885]: I0308 20:42:05.456790 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d01447b3c7e3221c3dee3e2c801f3a57b4a606171c98022c9edfe738108226cb"
Mar 08 20:42:05 crc kubenswrapper[4885]: I0308 20:42:05.456808 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550042-d4xpj"
Mar 08 20:42:05 crc kubenswrapper[4885]: I0308 20:42:05.533954 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550036-mmj2c"]
Mar 08 20:42:05 crc kubenswrapper[4885]: I0308 20:42:05.546071 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550036-mmj2c"]
Mar 08 20:42:06 crc kubenswrapper[4885]: I0308 20:42:06.368326 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"
Mar 08 20:42:07 crc kubenswrapper[4885]: I0308 20:42:07.381177 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba33559-23ce-4dec-a0fb-d479e47d6f1c" path="/var/lib/kubelet/pods/4ba33559-23ce-4dec-a0fb-d479e47d6f1c/volumes"
Mar 08 20:42:07 crc kubenswrapper[4885]: I0308 20:42:07.499702 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8"}
Mar 08 20:43:02 crc kubenswrapper[4885]: I0308 20:43:02.869645 4885 scope.go:117] "RemoveContainer" containerID="4655c509628eeb67ead0ef189a244c640e2a8d6513d351ea58b54fb12caa9de4"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.174403 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550044-682qm"]
Mar 08 20:44:00 crc kubenswrapper[4885]: E0308 20:44:00.175838 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34acfccb-db62-40e1-b46c-3227ce6e32ab" containerName="oc"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.175872 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="34acfccb-db62-40e1-b46c-3227ce6e32ab" containerName="oc"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.176242 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="34acfccb-db62-40e1-b46c-3227ce6e32ab" containerName="oc"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.177264 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550044-682qm"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.181670 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550044-682qm"]
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.182667 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.182966 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.183327 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.332871 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t98fw\" (UniqueName: \"kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw\") pod \"auto-csr-approver-29550044-682qm\" (UID: \"8cbaad9b-e652-438c-9b41-f414447382c5\") " pod="openshift-infra/auto-csr-approver-29550044-682qm"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.434406 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t98fw\" (UniqueName: \"kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw\") pod \"auto-csr-approver-29550044-682qm\" (UID: \"8cbaad9b-e652-438c-9b41-f414447382c5\") " pod="openshift-infra/auto-csr-approver-29550044-682qm"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.455945 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t98fw\" (UniqueName: \"kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw\") pod \"auto-csr-approver-29550044-682qm\" (UID: \"8cbaad9b-e652-438c-9b41-f414447382c5\") " pod="openshift-infra/auto-csr-approver-29550044-682qm"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.506955 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550044-682qm"
Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.965842 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550044-682qm"]
Mar 08 20:44:01 crc kubenswrapper[4885]: I0308 20:44:01.582800 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550044-682qm" event={"ID":"8cbaad9b-e652-438c-9b41-f414447382c5","Type":"ContainerStarted","Data":"2f4b9b024ce81457b0f9f5b00402fd0eeb7b7ae7e5bc5372f49638c5ccad9a3a"}
Mar 08 20:44:02 crc kubenswrapper[4885]: I0308 20:44:02.594123 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550044-682qm" event={"ID":"8cbaad9b-e652-438c-9b41-f414447382c5","Type":"ContainerStarted","Data":"fc9b989616ec5b500765230953c308a829faaac795ceed2b8cacd49b9b6ec121"}
Mar 08 20:44:02 crc kubenswrapper[4885]: I0308 20:44:02.617655 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550044-682qm" podStartSLOduration=1.451696281 podStartE2EDuration="2.617628468s" podCreationTimestamp="2026-03-08 20:44:00 +0000 UTC" firstStartedPulling="2026-03-08 20:44:00.974289652 +0000 UTC m=+4342.370343725" lastFinishedPulling="2026-03-08 20:44:02.140221849 +0000 UTC m=+4343.536275912" observedRunningTime="2026-03-08 20:44:02.616835447 +0000 UTC m=+4344.012889510" watchObservedRunningTime="2026-03-08 20:44:02.617628468 +0000 UTC m=+4344.013682531"
Mar 08 20:44:03 crc kubenswrapper[4885]: I0308 20:44:03.604443 4885 generic.go:334] "Generic (PLEG): container finished" podID="8cbaad9b-e652-438c-9b41-f414447382c5" containerID="fc9b989616ec5b500765230953c308a829faaac795ceed2b8cacd49b9b6ec121" exitCode=0
Mar 08 20:44:03 crc kubenswrapper[4885]: I0308 20:44:03.604520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550044-682qm" event={"ID":"8cbaad9b-e652-438c-9b41-f414447382c5","Type":"ContainerDied","Data":"fc9b989616ec5b500765230953c308a829faaac795ceed2b8cacd49b9b6ec121"}
Mar 08 20:44:04 crc kubenswrapper[4885]: I0308 20:44:04.960828 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550044-682qm"
Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.134656 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t98fw\" (UniqueName: \"kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw\") pod \"8cbaad9b-e652-438c-9b41-f414447382c5\" (UID: \"8cbaad9b-e652-438c-9b41-f414447382c5\") "
Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.142438 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw" (OuterVolumeSpecName: "kube-api-access-t98fw") pod "8cbaad9b-e652-438c-9b41-f414447382c5" (UID: "8cbaad9b-e652-438c-9b41-f414447382c5"). InnerVolumeSpecName "kube-api-access-t98fw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.236671 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t98fw\" (UniqueName: \"kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw\") on node \"crc\" DevicePath \"\""
Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.626677 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550044-682qm" event={"ID":"8cbaad9b-e652-438c-9b41-f414447382c5","Type":"ContainerDied","Data":"2f4b9b024ce81457b0f9f5b00402fd0eeb7b7ae7e5bc5372f49638c5ccad9a3a"}
Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.626714 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4b9b024ce81457b0f9f5b00402fd0eeb7b7ae7e5bc5372f49638c5ccad9a3a"
Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.626782 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550044-682qm"
Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.720435 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550038-r82w9"]
Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.728652 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550038-r82w9"]
Mar 08 20:44:07 crc kubenswrapper[4885]: I0308 20:44:07.386183 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4b8072-1e19-4a26-b038-2a3c6d634760" path="/var/lib/kubelet/pods/5d4b8072-1e19-4a26-b038-2a3c6d634760/volumes"
Mar 08 20:44:32 crc kubenswrapper[4885]: I0308 20:44:32.818459 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 20:44:32 crc kubenswrapper[4885]: I0308 20:44:32.821319 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.180896 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"]
Mar 08 20:45:00 crc kubenswrapper[4885]: E0308 20:45:00.182833 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbaad9b-e652-438c-9b41-f414447382c5" containerName="oc"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.182873 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbaad9b-e652-438c-9b41-f414447382c5" containerName="oc"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.183214 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cbaad9b-e652-438c-9b41-f414447382c5" containerName="oc"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.184139 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.190970 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.191253 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.242656 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"]
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.263748 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.264031 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8m8q\" (UniqueName: \"kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.264055 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.365458 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8m8q\" (UniqueName: \"kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.365517 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.365583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.367174 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.372342 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.393027 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8m8q\" (UniqueName: \"kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.542774 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:01 crc kubenswrapper[4885]: I0308 20:45:01.058525 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"]
Mar 08 20:45:01 crc kubenswrapper[4885]: I0308 20:45:01.178017 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" event={"ID":"414af8b3-3809-477a-a110-9acaf82a7a3b","Type":"ContainerStarted","Data":"8832d09ad306d4bc97a3537f40f92acf02565c06ec3f87254517f9c5fcc5af75"}
Mar 08 20:45:02 crc kubenswrapper[4885]: I0308 20:45:02.198295 4885 generic.go:334] "Generic (PLEG): container finished" podID="414af8b3-3809-477a-a110-9acaf82a7a3b" containerID="62ad3a335e07200b3e1dfc3daa3934ac465add0682b6dca882716bb449686e0a" exitCode=0
Mar 08 20:45:02 crc kubenswrapper[4885]: I0308 20:45:02.198536 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" event={"ID":"414af8b3-3809-477a-a110-9acaf82a7a3b","Type":"ContainerDied","Data":"62ad3a335e07200b3e1dfc3daa3934ac465add0682b6dca882716bb449686e0a"}
Mar 08 20:45:02 crc kubenswrapper[4885]: I0308 20:45:02.843993 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 20:45:02 crc kubenswrapper[4885]: I0308 20:45:02.844076 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.001597 4885 scope.go:117] "RemoveContainer" containerID="dce1374ad69b07a0e467431de66db375a54313b87f63c7ea37e12c9eb571e627"
Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.549306 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.623708 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume\") pod \"414af8b3-3809-477a-a110-9acaf82a7a3b\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") "
Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.623761 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8m8q\" (UniqueName: \"kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q\") pod \"414af8b3-3809-477a-a110-9acaf82a7a3b\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") "
Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.624026 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume\") pod \"414af8b3-3809-477a-a110-9acaf82a7a3b\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") "
Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.624466 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume" (OuterVolumeSpecName: "config-volume") pod "414af8b3-3809-477a-a110-9acaf82a7a3b" (UID: "414af8b3-3809-477a-a110-9acaf82a7a3b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.629476 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q" (OuterVolumeSpecName: "kube-api-access-j8m8q") pod "414af8b3-3809-477a-a110-9acaf82a7a3b" (UID: "414af8b3-3809-477a-a110-9acaf82a7a3b"). InnerVolumeSpecName "kube-api-access-j8m8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.630362 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "414af8b3-3809-477a-a110-9acaf82a7a3b" (UID: "414af8b3-3809-477a-a110-9acaf82a7a3b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.724833 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.724869 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume\") on node \"crc\" DevicePath \"\""
Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.724883 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8m8q\" (UniqueName: \"kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q\") on node \"crc\" DevicePath \"\""
Mar 08 20:45:04 crc kubenswrapper[4885]: I0308 20:45:04.218497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" event={"ID":"414af8b3-3809-477a-a110-9acaf82a7a3b","Type":"ContainerDied","Data":"8832d09ad306d4bc97a3537f40f92acf02565c06ec3f87254517f9c5fcc5af75"}
Mar 08 20:45:04 crc kubenswrapper[4885]: I0308 20:45:04.218557 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8832d09ad306d4bc97a3537f40f92acf02565c06ec3f87254517f9c5fcc5af75"
Mar 08 20:45:04 crc kubenswrapper[4885]: I0308 20:45:04.218613 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"
Mar 08 20:45:04 crc kubenswrapper[4885]: I0308 20:45:04.651447 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll"]
Mar 08 20:45:04 crc kubenswrapper[4885]: I0308 20:45:04.663282 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll"]
Mar 08 20:45:05 crc kubenswrapper[4885]: I0308 20:45:05.382328 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0336d864-07a3-41ed-9327-8a39d16d667f" path="/var/lib/kubelet/pods/0336d864-07a3-41ed-9327-8a39d16d667f/volumes"
Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.000381 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-knn77"]
Mar 08 20:45:32 crc kubenswrapper[4885]: E0308 20:45:32.001529 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414af8b3-3809-477a-a110-9acaf82a7a3b" containerName="collect-profiles"
Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.001551 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="414af8b3-3809-477a-a110-9acaf82a7a3b" containerName="collect-profiles"
Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.001819 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="414af8b3-3809-477a-a110-9acaf82a7a3b" containerName="collect-profiles"
Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.003672 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knn77"
Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.009715 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knn77"]
Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.115908 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsxg\" (UniqueName: \"kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77"
Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.116071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77"
Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.116277 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77"
Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.218382 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77"
Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.218689 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6xsxg\" (UniqueName: \"kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.218768 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.218987 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.219387 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.251573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xsxg\" (UniqueName: \"kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.338878 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.795064 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knn77"] Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.818872 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.818950 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.819011 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.819616 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.819690 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" 
containerID="cri-o://e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8" gracePeriod=600 Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.483322 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8" exitCode=0 Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.483418 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8"} Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.483708 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"} Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.483729 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.487438 4885 generic.go:334] "Generic (PLEG): container finished" podID="8135c179-1825-4687-93d5-8498573a991c" containerID="34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778" exitCode=0 Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.487574 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerDied","Data":"34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778"} Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.487697 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" 
event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerStarted","Data":"59098b090e1126b0e7d7b18062b58caa8e180d7c1dec6b100ccbb38d0edcbeef"} Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.489711 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:45:35 crc kubenswrapper[4885]: I0308 20:45:35.512277 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerStarted","Data":"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729"} Mar 08 20:45:36 crc kubenswrapper[4885]: I0308 20:45:36.523277 4885 generic.go:334] "Generic (PLEG): container finished" podID="8135c179-1825-4687-93d5-8498573a991c" containerID="146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729" exitCode=0 Mar 08 20:45:36 crc kubenswrapper[4885]: I0308 20:45:36.523341 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerDied","Data":"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729"} Mar 08 20:45:37 crc kubenswrapper[4885]: I0308 20:45:37.539135 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerStarted","Data":"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966"} Mar 08 20:45:37 crc kubenswrapper[4885]: I0308 20:45:37.577747 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-knn77" podStartSLOduration=3.172405744 podStartE2EDuration="6.577720829s" podCreationTimestamp="2026-03-08 20:45:31 +0000 UTC" firstStartedPulling="2026-03-08 20:45:33.489443474 +0000 UTC m=+4434.885497507" lastFinishedPulling="2026-03-08 20:45:36.894758529 +0000 UTC m=+4438.290812592" 
observedRunningTime="2026-03-08 20:45:37.566399226 +0000 UTC m=+4438.962453319" watchObservedRunningTime="2026-03-08 20:45:37.577720829 +0000 UTC m=+4438.973774862" Mar 08 20:45:42 crc kubenswrapper[4885]: I0308 20:45:42.339651 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:42 crc kubenswrapper[4885]: I0308 20:45:42.340261 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:43 crc kubenswrapper[4885]: I0308 20:45:43.417180 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-knn77" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="registry-server" probeResult="failure" output=< Mar 08 20:45:43 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 20:45:43 crc kubenswrapper[4885]: > Mar 08 20:45:52 crc kubenswrapper[4885]: I0308 20:45:52.428593 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:52 crc kubenswrapper[4885]: I0308 20:45:52.509968 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:52 crc kubenswrapper[4885]: I0308 20:45:52.685978 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knn77"] Mar 08 20:45:53 crc kubenswrapper[4885]: I0308 20:45:53.701710 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-knn77" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="registry-server" containerID="cri-o://d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966" gracePeriod=2 Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.243450 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.433666 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content\") pod \"8135c179-1825-4687-93d5-8498573a991c\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.433886 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities\") pod \"8135c179-1825-4687-93d5-8498573a991c\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.434335 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xsxg\" (UniqueName: \"kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg\") pod \"8135c179-1825-4687-93d5-8498573a991c\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.435852 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities" (OuterVolumeSpecName: "utilities") pod "8135c179-1825-4687-93d5-8498573a991c" (UID: "8135c179-1825-4687-93d5-8498573a991c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.447161 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg" (OuterVolumeSpecName: "kube-api-access-6xsxg") pod "8135c179-1825-4687-93d5-8498573a991c" (UID: "8135c179-1825-4687-93d5-8498573a991c"). InnerVolumeSpecName "kube-api-access-6xsxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.537305 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xsxg\" (UniqueName: \"kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg\") on node \"crc\" DevicePath \"\"" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.537353 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.611524 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8135c179-1825-4687-93d5-8498573a991c" (UID: "8135c179-1825-4687-93d5-8498573a991c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.639120 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.716706 4885 generic.go:334] "Generic (PLEG): container finished" podID="8135c179-1825-4687-93d5-8498573a991c" containerID="d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966" exitCode=0 Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.716735 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.716773 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerDied","Data":"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966"} Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.716857 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerDied","Data":"59098b090e1126b0e7d7b18062b58caa8e180d7c1dec6b100ccbb38d0edcbeef"} Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.716892 4885 scope.go:117] "RemoveContainer" containerID="d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.748812 4885 scope.go:117] "RemoveContainer" containerID="146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.781200 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knn77"] Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.793953 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-knn77"] Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.798852 4885 scope.go:117] "RemoveContainer" containerID="34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.832221 4885 scope.go:117] "RemoveContainer" containerID="d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966" Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.832986 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966\": container with ID starting with d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966 not found: ID does not exist" containerID="d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.833065 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966"} err="failed to get container status \"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966\": rpc error: code = NotFound desc = could not find container \"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966\": container with ID starting with d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966 not found: ID does not exist" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.833114 4885 scope.go:117] "RemoveContainer" containerID="146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729" Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.833701 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729\": container with ID starting with 146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729 not found: ID does not exist" containerID="146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.833776 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729"} err="failed to get container status \"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729\": rpc error: code = NotFound desc = could not find container \"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729\": container with ID 
starting with 146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729 not found: ID does not exist" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.833838 4885 scope.go:117] "RemoveContainer" containerID="34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778" Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.834456 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778\": container with ID starting with 34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778 not found: ID does not exist" containerID="34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.834510 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778"} err="failed to get container status \"34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778\": rpc error: code = NotFound desc = could not find container \"34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778\": container with ID starting with 34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778 not found: ID does not exist" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.898211 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.899060 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="extract-content" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.899097 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="extract-content" Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.899186 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="registry-server" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.899202 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="registry-server" Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.899257 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="extract-utilities" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.899273 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="extract-utilities" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.899851 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="registry-server" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.903601 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.922833 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.946430 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jtkr\" (UniqueName: \"kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.946603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.946701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.047987 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.048131 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5jtkr\" (UniqueName: \"kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.048288 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.049160 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.049357 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.077037 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jtkr\" (UniqueName: \"kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.223138 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.387889 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8135c179-1825-4687-93d5-8498573a991c" path="/var/lib/kubelet/pods/8135c179-1825-4687-93d5-8498573a991c/volumes" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.603350 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:45:55 crc kubenswrapper[4885]: W0308 20:45:55.616246 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40daff17_4ce3_4cda_844e_8c2690d94d31.slice/crio-07923089e1a75f94847a0d361ce5c3079d7eb16c5943f0f5cb6462a2153ce03d WatchSource:0}: Error finding container 07923089e1a75f94847a0d361ce5c3079d7eb16c5943f0f5cb6462a2153ce03d: Status 404 returned error can't find the container with id 07923089e1a75f94847a0d361ce5c3079d7eb16c5943f0f5cb6462a2153ce03d Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.724054 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerStarted","Data":"07923089e1a75f94847a0d361ce5c3079d7eb16c5943f0f5cb6462a2153ce03d"} Mar 08 20:45:56 crc kubenswrapper[4885]: I0308 20:45:56.738841 4885 generic.go:334] "Generic (PLEG): container finished" podID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerID="2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e" exitCode=0 Mar 08 20:45:56 crc kubenswrapper[4885]: I0308 20:45:56.738915 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerDied","Data":"2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e"} Mar 08 20:45:57 crc kubenswrapper[4885]: I0308 
20:45:57.750999 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerStarted","Data":"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813"} Mar 08 20:45:58 crc kubenswrapper[4885]: I0308 20:45:58.762887 4885 generic.go:334] "Generic (PLEG): container finished" podID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerID="bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813" exitCode=0 Mar 08 20:45:58 crc kubenswrapper[4885]: I0308 20:45:58.762966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerDied","Data":"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813"} Mar 08 20:45:59 crc kubenswrapper[4885]: I0308 20:45:59.773706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerStarted","Data":"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd"} Mar 08 20:45:59 crc kubenswrapper[4885]: I0308 20:45:59.806682 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t78sc" podStartSLOduration=3.403241506 podStartE2EDuration="5.80665173s" podCreationTimestamp="2026-03-08 20:45:54 +0000 UTC" firstStartedPulling="2026-03-08 20:45:56.741476089 +0000 UTC m=+4458.137530142" lastFinishedPulling="2026-03-08 20:45:59.144886303 +0000 UTC m=+4460.540940366" observedRunningTime="2026-03-08 20:45:59.802472437 +0000 UTC m=+4461.198526530" watchObservedRunningTime="2026-03-08 20:45:59.80665173 +0000 UTC m=+4461.202705793" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.155386 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550046-f4q77"] Mar 08 20:46:00 
crc kubenswrapper[4885]: I0308 20:46:00.156834 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.159900 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.160014 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.161035 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.167448 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550046-f4q77"] Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.231717 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nh96\" (UniqueName: \"kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96\") pod \"auto-csr-approver-29550046-f4q77\" (UID: \"b82d1463-69f5-455a-b2bf-493366c067f7\") " pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.333577 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nh96\" (UniqueName: \"kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96\") pod \"auto-csr-approver-29550046-f4q77\" (UID: \"b82d1463-69f5-455a-b2bf-493366c067f7\") " pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.366323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nh96\" (UniqueName: \"kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96\") 
pod \"auto-csr-approver-29550046-f4q77\" (UID: \"b82d1463-69f5-455a-b2bf-493366c067f7\") " pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.486327 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.994416 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550046-f4q77"] Mar 08 20:46:01 crc kubenswrapper[4885]: W0308 20:46:01.067277 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb82d1463_69f5_455a_b2bf_493366c067f7.slice/crio-33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb WatchSource:0}: Error finding container 33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb: Status 404 returned error can't find the container with id 33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb Mar 08 20:46:01 crc kubenswrapper[4885]: I0308 20:46:01.793857 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550046-f4q77" event={"ID":"b82d1463-69f5-455a-b2bf-493366c067f7","Type":"ContainerStarted","Data":"33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb"} Mar 08 20:46:02 crc kubenswrapper[4885]: I0308 20:46:02.806333 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550046-f4q77" event={"ID":"b82d1463-69f5-455a-b2bf-493366c067f7","Type":"ContainerStarted","Data":"28fd3b4e0daaabbfc53b90770763cfd958469e45abae3ae9fd53f9c9e8ab327b"} Mar 08 20:46:03 crc kubenswrapper[4885]: I0308 20:46:03.101693 4885 scope.go:117] "RemoveContainer" containerID="04ce20a75f575125843cdf885d5d1cfa9b696f27d4253665b2071884d88ab3e4" Mar 08 20:46:03 crc kubenswrapper[4885]: I0308 20:46:03.818712 4885 generic.go:334] "Generic (PLEG): container 
finished" podID="b82d1463-69f5-455a-b2bf-493366c067f7" containerID="28fd3b4e0daaabbfc53b90770763cfd958469e45abae3ae9fd53f9c9e8ab327b" exitCode=0 Mar 08 20:46:03 crc kubenswrapper[4885]: I0308 20:46:03.818779 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550046-f4q77" event={"ID":"b82d1463-69f5-455a-b2bf-493366c067f7","Type":"ContainerDied","Data":"28fd3b4e0daaabbfc53b90770763cfd958469e45abae3ae9fd53f9c9e8ab327b"} Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.210896 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.225467 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.225533 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.309348 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nh96\" (UniqueName: \"kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96\") pod \"b82d1463-69f5-455a-b2bf-493366c067f7\" (UID: \"b82d1463-69f5-455a-b2bf-493366c067f7\") " Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.318217 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96" (OuterVolumeSpecName: "kube-api-access-6nh96") pod "b82d1463-69f5-455a-b2bf-493366c067f7" (UID: "b82d1463-69f5-455a-b2bf-493366c067f7"). InnerVolumeSpecName "kube-api-access-6nh96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.327738 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.410782 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nh96\" (UniqueName: \"kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96\") on node \"crc\" DevicePath \"\"" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.837697 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.838134 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550046-f4q77" event={"ID":"b82d1463-69f5-455a-b2bf-493366c067f7","Type":"ContainerDied","Data":"33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb"} Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.838204 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.887372 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.909482 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550040-n7d55"] Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.921057 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550040-n7d55"] Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.952469 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:46:07 
crc kubenswrapper[4885]: I0308 20:46:07.381530 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f06a86c2-3f71-42f0-8f33-558ffba8e527" path="/var/lib/kubelet/pods/f06a86c2-3f71-42f0-8f33-558ffba8e527/volumes" Mar 08 20:46:07 crc kubenswrapper[4885]: I0308 20:46:07.856253 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t78sc" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="registry-server" containerID="cri-o://923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd" gracePeriod=2 Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.311062 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.452002 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content\") pod \"40daff17-4ce3-4cda-844e-8c2690d94d31\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.452155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities\") pod \"40daff17-4ce3-4cda-844e-8c2690d94d31\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.452556 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jtkr\" (UniqueName: \"kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr\") pod \"40daff17-4ce3-4cda-844e-8c2690d94d31\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.453900 4885 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities" (OuterVolumeSpecName: "utilities") pod "40daff17-4ce3-4cda-844e-8c2690d94d31" (UID: "40daff17-4ce3-4cda-844e-8c2690d94d31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.459464 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr" (OuterVolumeSpecName: "kube-api-access-5jtkr") pod "40daff17-4ce3-4cda-844e-8c2690d94d31" (UID: "40daff17-4ce3-4cda-844e-8c2690d94d31"). InnerVolumeSpecName "kube-api-access-5jtkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.555418 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.555483 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jtkr\" (UniqueName: \"kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr\") on node \"crc\" DevicePath \"\"" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.803693 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40daff17-4ce3-4cda-844e-8c2690d94d31" (UID: "40daff17-4ce3-4cda-844e-8c2690d94d31"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.859595 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.868598 4885 generic.go:334] "Generic (PLEG): container finished" podID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerID="923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd" exitCode=0 Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.868665 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerDied","Data":"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd"} Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.868706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerDied","Data":"07923089e1a75f94847a0d361ce5c3079d7eb16c5943f0f5cb6462a2153ce03d"} Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.868735 4885 scope.go:117] "RemoveContainer" containerID="923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.868960 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.899904 4885 scope.go:117] "RemoveContainer" containerID="bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.920279 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.927238 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.935254 4885 scope.go:117] "RemoveContainer" containerID="2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.960522 4885 scope.go:117] "RemoveContainer" containerID="923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd" Mar 08 20:46:08 crc kubenswrapper[4885]: E0308 20:46:08.961186 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd\": container with ID starting with 923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd not found: ID does not exist" containerID="923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.961228 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd"} err="failed to get container status \"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd\": rpc error: code = NotFound desc = could not find container \"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd\": container with ID starting with 923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd not 
found: ID does not exist" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.961285 4885 scope.go:117] "RemoveContainer" containerID="bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813" Mar 08 20:46:08 crc kubenswrapper[4885]: E0308 20:46:08.961727 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813\": container with ID starting with bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813 not found: ID does not exist" containerID="bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.961766 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813"} err="failed to get container status \"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813\": rpc error: code = NotFound desc = could not find container \"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813\": container with ID starting with bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813 not found: ID does not exist" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.961793 4885 scope.go:117] "RemoveContainer" containerID="2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e" Mar 08 20:46:08 crc kubenswrapper[4885]: E0308 20:46:08.962168 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e\": container with ID starting with 2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e not found: ID does not exist" containerID="2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.962193 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e"} err="failed to get container status \"2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e\": rpc error: code = NotFound desc = could not find container \"2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e\": container with ID starting with 2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e not found: ID does not exist" Mar 08 20:46:09 crc kubenswrapper[4885]: I0308 20:46:09.383499 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" path="/var/lib/kubelet/pods/40daff17-4ce3-4cda-844e-8c2690d94d31/volumes" Mar 08 20:47:03 crc kubenswrapper[4885]: I0308 20:47:03.238834 4885 scope.go:117] "RemoveContainer" containerID="cd2399263f696f150807aac189b8392d7375833b5910349cae536de7fbfd333a" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.448528 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-psfrk"] Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.456980 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-psfrk"] Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.543634 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-lnm22"] Mar 08 20:47:27 crc kubenswrapper[4885]: E0308 20:47:27.543948 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="extract-utilities" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.543962 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="extract-utilities" Mar 08 20:47:27 crc kubenswrapper[4885]: E0308 20:47:27.543999 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="registry-server" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544009 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="registry-server" Mar 08 20:47:27 crc kubenswrapper[4885]: E0308 20:47:27.544022 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="extract-content" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544030 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="extract-content" Mar 08 20:47:27 crc kubenswrapper[4885]: E0308 20:47:27.544050 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82d1463-69f5-455a-b2bf-493366c067f7" containerName="oc" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544058 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82d1463-69f5-455a-b2bf-493366c067f7" containerName="oc" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544226 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b82d1463-69f5-455a-b2bf-493366c067f7" containerName="oc" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544257 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="registry-server" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544743 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.547838 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.548342 4885 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-lm6tv" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.548416 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.548475 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.558652 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lnm22"] Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.643261 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.643314 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5l82\" (UniqueName: \"kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.643466 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt\") pod \"crc-storage-crc-lnm22\" (UID: 
\"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.745378 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.745428 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5l82\" (UniqueName: \"kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.745482 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.745785 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.746189 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.764374 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5l82\" (UniqueName: \"kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.865612 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:28 crc kubenswrapper[4885]: I0308 20:47:28.435811 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lnm22"] Mar 08 20:47:28 crc kubenswrapper[4885]: I0308 20:47:28.713356 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lnm22" event={"ID":"92e8ccce-5f4b-4070-a497-9967a01c5897","Type":"ContainerStarted","Data":"017bda305aadc3ff871118200374074c94a2db8f972d0461cf0b20699579b547"} Mar 08 20:47:29 crc kubenswrapper[4885]: I0308 20:47:29.377542 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a3c0554-f8ec-4a68-a332-1eba738b28c6" path="/var/lib/kubelet/pods/1a3c0554-f8ec-4a68-a332-1eba738b28c6/volumes" Mar 08 20:47:29 crc kubenswrapper[4885]: I0308 20:47:29.724061 4885 generic.go:334] "Generic (PLEG): container finished" podID="92e8ccce-5f4b-4070-a497-9967a01c5897" containerID="71660241cb857dd4a39450381f9b1b87218e1ce546c14f61658581a4f1a6ae9d" exitCode=0 Mar 08 20:47:29 crc kubenswrapper[4885]: I0308 20:47:29.724098 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lnm22" event={"ID":"92e8ccce-5f4b-4070-a497-9967a01c5897","Type":"ContainerDied","Data":"71660241cb857dd4a39450381f9b1b87218e1ce546c14f61658581a4f1a6ae9d"} Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.063354 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.101688 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5l82\" (UniqueName: \"kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82\") pod \"92e8ccce-5f4b-4070-a497-9967a01c5897\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.101769 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage\") pod \"92e8ccce-5f4b-4070-a497-9967a01c5897\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.101881 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt\") pod \"92e8ccce-5f4b-4070-a497-9967a01c5897\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.102251 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "92e8ccce-5f4b-4070-a497-9967a01c5897" (UID: "92e8ccce-5f4b-4070-a497-9967a01c5897"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.107310 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82" (OuterVolumeSpecName: "kube-api-access-b5l82") pod "92e8ccce-5f4b-4070-a497-9967a01c5897" (UID: "92e8ccce-5f4b-4070-a497-9967a01c5897"). InnerVolumeSpecName "kube-api-access-b5l82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.126681 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "92e8ccce-5f4b-4070-a497-9967a01c5897" (UID: "92e8ccce-5f4b-4070-a497-9967a01c5897"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.203384 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5l82\" (UniqueName: \"kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82\") on node \"crc\" DevicePath \"\"" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.203439 4885 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.203461 4885 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.744383 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lnm22" event={"ID":"92e8ccce-5f4b-4070-a497-9967a01c5897","Type":"ContainerDied","Data":"017bda305aadc3ff871118200374074c94a2db8f972d0461cf0b20699579b547"} Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.744443 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="017bda305aadc3ff871118200374074c94a2db8f972d0461cf0b20699579b547" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.744511 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.634811 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-lnm22"] Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.646012 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-lnm22"] Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.742657 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9ps2g"] Mar 08 20:47:33 crc kubenswrapper[4885]: E0308 20:47:33.743487 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e8ccce-5f4b-4070-a497-9967a01c5897" containerName="storage" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.743675 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e8ccce-5f4b-4070-a497-9967a01c5897" containerName="storage" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.744243 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="92e8ccce-5f4b-4070-a497-9967a01c5897" containerName="storage" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.747318 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.754557 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.754756 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.755301 4885 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-lm6tv"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.755436 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.759890 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9ps2g"]
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.843038 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.843469 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2kmh\" (UniqueName: \"kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.843663 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.946028 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.946398 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2kmh\" (UniqueName: \"kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.946851 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.947176 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.947460 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.981333 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2kmh\" (UniqueName: \"kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:34 crc kubenswrapper[4885]: I0308 20:47:34.081684 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:34 crc kubenswrapper[4885]: I0308 20:47:34.575435 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9ps2g"]
Mar 08 20:47:34 crc kubenswrapper[4885]: I0308 20:47:34.769740 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9ps2g" event={"ID":"f8dcf555-f050-40ed-a174-5a5160c3124d","Type":"ContainerStarted","Data":"2b195445549dba750c914ded4bfa8f8063806ddc02e4016755dc26a49f8b949a"}
Mar 08 20:47:35 crc kubenswrapper[4885]: I0308 20:47:35.379955 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92e8ccce-5f4b-4070-a497-9967a01c5897" path="/var/lib/kubelet/pods/92e8ccce-5f4b-4070-a497-9967a01c5897/volumes"
Mar 08 20:47:35 crc kubenswrapper[4885]: I0308 20:47:35.778119 4885 generic.go:334] "Generic (PLEG): container finished" podID="f8dcf555-f050-40ed-a174-5a5160c3124d" containerID="c4be26f875d4819cf8da13b6c1d96b277d86ed1ad18bb162bc1f8d6584a5fe61" exitCode=0
Mar 08 20:47:35 crc kubenswrapper[4885]: I0308 20:47:35.778161 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9ps2g" event={"ID":"f8dcf555-f050-40ed-a174-5a5160c3124d","Type":"ContainerDied","Data":"c4be26f875d4819cf8da13b6c1d96b277d86ed1ad18bb162bc1f8d6584a5fe61"}
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.096900 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.191634 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt\") pod \"f8dcf555-f050-40ed-a174-5a5160c3124d\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") "
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.191739 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2kmh\" (UniqueName: \"kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh\") pod \"f8dcf555-f050-40ed-a174-5a5160c3124d\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") "
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.191766 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage\") pod \"f8dcf555-f050-40ed-a174-5a5160c3124d\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") "
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.191900 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "f8dcf555-f050-40ed-a174-5a5160c3124d" (UID: "f8dcf555-f050-40ed-a174-5a5160c3124d"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.192179 4885 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.197529 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh" (OuterVolumeSpecName: "kube-api-access-l2kmh") pod "f8dcf555-f050-40ed-a174-5a5160c3124d" (UID: "f8dcf555-f050-40ed-a174-5a5160c3124d"). InnerVolumeSpecName "kube-api-access-l2kmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.219846 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "f8dcf555-f050-40ed-a174-5a5160c3124d" (UID: "f8dcf555-f050-40ed-a174-5a5160c3124d"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.293194 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2kmh\" (UniqueName: \"kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh\") on node \"crc\" DevicePath \"\""
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.293244 4885 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.807732 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9ps2g" event={"ID":"f8dcf555-f050-40ed-a174-5a5160c3124d","Type":"ContainerDied","Data":"2b195445549dba750c914ded4bfa8f8063806ddc02e4016755dc26a49f8b949a"}
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.808078 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b195445549dba750c914ded4bfa8f8063806ddc02e4016755dc26a49f8b949a"
Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.808148 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9ps2g"
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.175084 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550048-bs8p7"]
Mar 08 20:48:00 crc kubenswrapper[4885]: E0308 20:48:00.175899 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dcf555-f050-40ed-a174-5a5160c3124d" containerName="storage"
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.175917 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dcf555-f050-40ed-a174-5a5160c3124d" containerName="storage"
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.176142 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8dcf555-f050-40ed-a174-5a5160c3124d" containerName="storage"
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.176631 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550048-bs8p7"
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.179431 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.179604 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.184097 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.202202 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550048-bs8p7"]
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.256151 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmt97\" (UniqueName: \"kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97\") pod \"auto-csr-approver-29550048-bs8p7\" (UID: \"8cc070c5-7e69-4caa-82a5-b21b8fa66256\") " pod="openshift-infra/auto-csr-approver-29550048-bs8p7"
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.356900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmt97\" (UniqueName: \"kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97\") pod \"auto-csr-approver-29550048-bs8p7\" (UID: \"8cc070c5-7e69-4caa-82a5-b21b8fa66256\") " pod="openshift-infra/auto-csr-approver-29550048-bs8p7"
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.379422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmt97\" (UniqueName: \"kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97\") pod \"auto-csr-approver-29550048-bs8p7\" (UID: \"8cc070c5-7e69-4caa-82a5-b21b8fa66256\") " pod="openshift-infra/auto-csr-approver-29550048-bs8p7"
Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.509775 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550048-bs8p7"
Mar 08 20:48:01 crc kubenswrapper[4885]: I0308 20:48:01.041150 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550048-bs8p7"]
Mar 08 20:48:02 crc kubenswrapper[4885]: I0308 20:48:02.030187 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550048-bs8p7" event={"ID":"8cc070c5-7e69-4caa-82a5-b21b8fa66256","Type":"ContainerStarted","Data":"d559168086f0f7fae799a2dd202de00b0967f6d50607ac476a3dd2bdaabb72ee"}
Mar 08 20:48:02 crc kubenswrapper[4885]: I0308 20:48:02.818562 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 20:48:02 crc kubenswrapper[4885]: I0308 20:48:02.818891 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 20:48:03 crc kubenswrapper[4885]: I0308 20:48:03.041592 4885 generic.go:334] "Generic (PLEG): container finished" podID="8cc070c5-7e69-4caa-82a5-b21b8fa66256" containerID="4768620fbc443f87f40deff915eceaf069ee28b8e25c0efabb4228990e81cee6" exitCode=0
Mar 08 20:48:03 crc kubenswrapper[4885]: I0308 20:48:03.041680 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550048-bs8p7" event={"ID":"8cc070c5-7e69-4caa-82a5-b21b8fa66256","Type":"ContainerDied","Data":"4768620fbc443f87f40deff915eceaf069ee28b8e25c0efabb4228990e81cee6"}
Mar 08 20:48:03 crc kubenswrapper[4885]: I0308 20:48:03.369058 4885 scope.go:117] "RemoveContainer" containerID="b57ab7fa02bc0b3cc8cdcea97c2fd6abc762d9cf4cd62e7caa0369ec5c53eef8"
Mar 08 20:48:04 crc kubenswrapper[4885]: I0308 20:48:04.444745 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550048-bs8p7"
Mar 08 20:48:04 crc kubenswrapper[4885]: I0308 20:48:04.529471 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmt97\" (UniqueName: \"kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97\") pod \"8cc070c5-7e69-4caa-82a5-b21b8fa66256\" (UID: \"8cc070c5-7e69-4caa-82a5-b21b8fa66256\") "
Mar 08 20:48:04 crc kubenswrapper[4885]: I0308 20:48:04.549255 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97" (OuterVolumeSpecName: "kube-api-access-mmt97") pod "8cc070c5-7e69-4caa-82a5-b21b8fa66256" (UID: "8cc070c5-7e69-4caa-82a5-b21b8fa66256"). InnerVolumeSpecName "kube-api-access-mmt97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:48:04 crc kubenswrapper[4885]: I0308 20:48:04.631465 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmt97\" (UniqueName: \"kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97\") on node \"crc\" DevicePath \"\""
Mar 08 20:48:05 crc kubenswrapper[4885]: I0308 20:48:05.067085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550048-bs8p7" event={"ID":"8cc070c5-7e69-4caa-82a5-b21b8fa66256","Type":"ContainerDied","Data":"d559168086f0f7fae799a2dd202de00b0967f6d50607ac476a3dd2bdaabb72ee"}
Mar 08 20:48:05 crc kubenswrapper[4885]: I0308 20:48:05.067149 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d559168086f0f7fae799a2dd202de00b0967f6d50607ac476a3dd2bdaabb72ee"
Mar 08 20:48:05 crc kubenswrapper[4885]: I0308 20:48:05.067426 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550048-bs8p7"
Mar 08 20:48:05 crc kubenswrapper[4885]: I0308 20:48:05.560682 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550042-d4xpj"]
Mar 08 20:48:05 crc kubenswrapper[4885]: I0308 20:48:05.570496 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550042-d4xpj"]
Mar 08 20:48:07 crc kubenswrapper[4885]: I0308 20:48:07.385009 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34acfccb-db62-40e1-b46c-3227ce6e32ab" path="/var/lib/kubelet/pods/34acfccb-db62-40e1-b46c-3227ce6e32ab/volumes"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.570189 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"]
Mar 08 20:48:09 crc kubenswrapper[4885]: E0308 20:48:09.571714 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc070c5-7e69-4caa-82a5-b21b8fa66256" containerName="oc"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.571856 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc070c5-7e69-4caa-82a5-b21b8fa66256" containerName="oc"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.572263 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cc070c5-7e69-4caa-82a5-b21b8fa66256" containerName="oc"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.574129 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.592449 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"]
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.608131 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.608453 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.608599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnnj\" (UniqueName: \"kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.710378 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.710640 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.711057 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.711438 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.711624 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnnj\" (UniqueName: \"kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.751208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtnnj\" (UniqueName: \"kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.918529 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:10 crc kubenswrapper[4885]: I0308 20:48:10.422157 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"]
Mar 08 20:48:11 crc kubenswrapper[4885]: I0308 20:48:11.117737 4885 generic.go:334] "Generic (PLEG): container finished" podID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerID="d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf" exitCode=0
Mar 08 20:48:11 crc kubenswrapper[4885]: I0308 20:48:11.117807 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerDied","Data":"d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf"}
Mar 08 20:48:11 crc kubenswrapper[4885]: I0308 20:48:11.117839 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerStarted","Data":"4a4290465a756fe6805abd596855031f80b77d73d18b519f278c9aa3f2cd0de5"}
Mar 08 20:48:12 crc kubenswrapper[4885]: I0308 20:48:12.127483 4885 generic.go:334] "Generic (PLEG): container finished" podID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerID="812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda" exitCode=0
Mar 08 20:48:12 crc kubenswrapper[4885]: I0308 20:48:12.127589 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerDied","Data":"812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda"}
Mar 08 20:48:13 crc kubenswrapper[4885]: I0308 20:48:13.138178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerStarted","Data":"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a"}
Mar 08 20:48:13 crc kubenswrapper[4885]: I0308 20:48:13.177485 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pqdpm" podStartSLOduration=2.74004976 podStartE2EDuration="4.177465642s" podCreationTimestamp="2026-03-08 20:48:09 +0000 UTC" firstStartedPulling="2026-03-08 20:48:11.120070724 +0000 UTC m=+4592.516124787" lastFinishedPulling="2026-03-08 20:48:12.557486606 +0000 UTC m=+4593.953540669" observedRunningTime="2026-03-08 20:48:13.172649452 +0000 UTC m=+4594.568703505" watchObservedRunningTime="2026-03-08 20:48:13.177465642 +0000 UTC m=+4594.573519675"
Mar 08 20:48:19 crc kubenswrapper[4885]: I0308 20:48:19.919539 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:19 crc kubenswrapper[4885]: I0308 20:48:19.920295 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:20 crc kubenswrapper[4885]: I0308 20:48:20.006393 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:20 crc kubenswrapper[4885]: I0308 20:48:20.274085 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:20 crc kubenswrapper[4885]: I0308 20:48:20.340475 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"]
Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.218962 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pqdpm" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="registry-server" containerID="cri-o://baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a" gracePeriod=2
Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.646423 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.817427 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities\") pod \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") "
Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.817664 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content\") pod \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") "
Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.817696 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtnnj\" (UniqueName: \"kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj\") pod \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") "
Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.818422 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities" (OuterVolumeSpecName: "utilities") pod "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" (UID: "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.822533 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj" (OuterVolumeSpecName: "kube-api-access-wtnnj") pod "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" (UID: "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2"). InnerVolumeSpecName "kube-api-access-wtnnj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.852053 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" (UID: "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.919762 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.919815 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.919841 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtnnj\" (UniqueName: \"kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj\") on node \"crc\" DevicePath \"\""
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.231429 4885 generic.go:334] "Generic (PLEG): container finished" podID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerID="baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a" exitCode=0
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.231520 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqdpm"
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.231503 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerDied","Data":"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a"}
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.231758 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerDied","Data":"4a4290465a756fe6805abd596855031f80b77d73d18b519f278c9aa3f2cd0de5"}
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.231805 4885 scope.go:117] "RemoveContainer" containerID="baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a"
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.259912 4885 scope.go:117] "RemoveContainer" containerID="812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda"
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.291333 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"]
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.297233 4885 scope.go:117] "RemoveContainer" containerID="d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf"
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.301494 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"]
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.328537 4885 scope.go:117] "RemoveContainer" containerID="baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a"
Mar 08 20:48:23 crc kubenswrapper[4885]: E0308 20:48:23.329124 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a\": container with ID starting with baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a not found: ID does not exist" containerID="baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a"
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.329178 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a"} err="failed to get container status \"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a\": rpc error: code = NotFound desc = could not find container \"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a\": container with ID starting with baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a not found: ID does not exist"
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.329214 4885 scope.go:117] "RemoveContainer" containerID="812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda"
Mar 08 20:48:23 crc kubenswrapper[4885]: E0308 20:48:23.329558 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda\": container with ID starting with 812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda not found: ID does not exist" containerID="812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda"
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.329595 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda"} err="failed to get container status \"812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda\": rpc error: code = NotFound desc = could not find container \"812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda\": container with ID starting with 812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda not found: ID does not exist"
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.329619 4885 scope.go:117] "RemoveContainer" containerID="d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf"
Mar 08 20:48:23 crc kubenswrapper[4885]: E0308 20:48:23.330033 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf\": container with ID starting with d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf not found: ID does not exist" containerID="d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf"
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.330097 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf"} err="failed to get container status \"d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf\": rpc error: code = NotFound desc = could not find container \"d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf\": container with ID starting with d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf not found: ID does not exist"
Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.387048 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" path="/var/lib/kubelet/pods/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2/volumes"
Mar 08 20:48:32 crc kubenswrapper[4885]: I0308 20:48:32.819111 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 20:48:32 crc kubenswrapper[4885]: I0308 20:48:32.819816 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 20:49:02 crc kubenswrapper[4885]: I0308 20:49:02.818690 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 20:49:02 crc kubenswrapper[4885]: I0308 20:49:02.819587 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 20:49:02 crc kubenswrapper[4885]: I0308 20:49:02.819656 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 20:49:02 crc kubenswrapper[4885]: I0308 20:49:02.820600 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 20:49:02 crc kubenswrapper[4885]: I0308 20:49:02.820705 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
containerName="machine-config-daemon" containerID="cri-o://0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" gracePeriod=600 Mar 08 20:49:02 crc kubenswrapper[4885]: E0308 20:49:02.955956 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:49:03 crc kubenswrapper[4885]: I0308 20:49:03.440785 4885 scope.go:117] "RemoveContainer" containerID="f37032ea52adfa6ddae3677872b806d5bfa71e165ea8806d1e82b81025d0feb8" Mar 08 20:49:03 crc kubenswrapper[4885]: I0308 20:49:03.638461 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" exitCode=0 Mar 08 20:49:03 crc kubenswrapper[4885]: I0308 20:49:03.638584 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"} Mar 08 20:49:03 crc kubenswrapper[4885]: I0308 20:49:03.638643 4885 scope.go:117] "RemoveContainer" containerID="e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8" Mar 08 20:49:03 crc kubenswrapper[4885]: I0308 20:49:03.639799 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:49:03 crc kubenswrapper[4885]: E0308 20:49:03.641862 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:49:17 crc kubenswrapper[4885]: I0308 20:49:17.368914 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:49:17 crc kubenswrapper[4885]: E0308 20:49:17.370084 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:49:29 crc kubenswrapper[4885]: I0308 20:49:29.375306 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:49:29 crc kubenswrapper[4885]: E0308 20:49:29.376426 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:49:44 crc kubenswrapper[4885]: I0308 20:49:44.367708 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:49:44 crc kubenswrapper[4885]: E0308 20:49:44.368402 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:49:59 crc kubenswrapper[4885]: I0308 20:49:59.376649 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:49:59 crc kubenswrapper[4885]: E0308 20:49:59.377634 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.156121 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550050-xrlpj"] Mar 08 20:50:00 crc kubenswrapper[4885]: E0308 20:50:00.157038 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="extract-utilities" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.157074 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="extract-utilities" Mar 08 20:50:00 crc kubenswrapper[4885]: E0308 20:50:00.157110 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="registry-server" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.157126 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="registry-server" Mar 08 20:50:00 crc kubenswrapper[4885]: E0308 20:50:00.157175 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="extract-content" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.157193 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="extract-content" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.157525 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="registry-server" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.158301 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.160722 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.163608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwmsz\" (UniqueName: \"kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz\") pod \"auto-csr-approver-29550050-xrlpj\" (UID: \"c364e354-f542-45ec-9322-125db18eb928\") " pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.172654 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550050-xrlpj"] Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.191282 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.191651 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.265108 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-wwmsz\" (UniqueName: \"kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz\") pod \"auto-csr-approver-29550050-xrlpj\" (UID: \"c364e354-f542-45ec-9322-125db18eb928\") " pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.297360 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwmsz\" (UniqueName: \"kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz\") pod \"auto-csr-approver-29550050-xrlpj\" (UID: \"c364e354-f542-45ec-9322-125db18eb928\") " pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.520151 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:01 crc kubenswrapper[4885]: I0308 20:50:01.025783 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550050-xrlpj"] Mar 08 20:50:01 crc kubenswrapper[4885]: I0308 20:50:01.191136 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" event={"ID":"c364e354-f542-45ec-9322-125db18eb928","Type":"ContainerStarted","Data":"bb773abed9327d88ac1406fbd8d50a08b4dad4d5fac82785dc3a6eb9776f0a9f"} Mar 08 20:50:03 crc kubenswrapper[4885]: I0308 20:50:03.210750 4885 generic.go:334] "Generic (PLEG): container finished" podID="c364e354-f542-45ec-9322-125db18eb928" containerID="00728729b1a54767f7bf5ead12746c7b79c9d4ef7c28991788ed27e082f5ef87" exitCode=0 Mar 08 20:50:03 crc kubenswrapper[4885]: I0308 20:50:03.210869 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" event={"ID":"c364e354-f542-45ec-9322-125db18eb928","Type":"ContainerDied","Data":"00728729b1a54767f7bf5ead12746c7b79c9d4ef7c28991788ed27e082f5ef87"} Mar 08 20:50:04 crc 
kubenswrapper[4885]: I0308 20:50:04.626643 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:04 crc kubenswrapper[4885]: I0308 20:50:04.742216 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwmsz\" (UniqueName: \"kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz\") pod \"c364e354-f542-45ec-9322-125db18eb928\" (UID: \"c364e354-f542-45ec-9322-125db18eb928\") " Mar 08 20:50:04 crc kubenswrapper[4885]: I0308 20:50:04.750350 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz" (OuterVolumeSpecName: "kube-api-access-wwmsz") pod "c364e354-f542-45ec-9322-125db18eb928" (UID: "c364e354-f542-45ec-9322-125db18eb928"). InnerVolumeSpecName "kube-api-access-wwmsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:50:04 crc kubenswrapper[4885]: I0308 20:50:04.844163 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwmsz\" (UniqueName: \"kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz\") on node \"crc\" DevicePath \"\"" Mar 08 20:50:05 crc kubenswrapper[4885]: I0308 20:50:05.239526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" event={"ID":"c364e354-f542-45ec-9322-125db18eb928","Type":"ContainerDied","Data":"bb773abed9327d88ac1406fbd8d50a08b4dad4d5fac82785dc3a6eb9776f0a9f"} Mar 08 20:50:05 crc kubenswrapper[4885]: I0308 20:50:05.239660 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb773abed9327d88ac1406fbd8d50a08b4dad4d5fac82785dc3a6eb9776f0a9f" Mar 08 20:50:05 crc kubenswrapper[4885]: I0308 20:50:05.239652 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:05 crc kubenswrapper[4885]: I0308 20:50:05.717777 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550044-682qm"] Mar 08 20:50:05 crc kubenswrapper[4885]: I0308 20:50:05.727441 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550044-682qm"] Mar 08 20:50:07 crc kubenswrapper[4885]: I0308 20:50:07.382464 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cbaad9b-e652-438c-9b41-f414447382c5" path="/var/lib/kubelet/pods/8cbaad9b-e652-438c-9b41-f414447382c5/volumes" Mar 08 20:50:12 crc kubenswrapper[4885]: I0308 20:50:12.367728 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:50:12 crc kubenswrapper[4885]: E0308 20:50:12.368506 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:50:27 crc kubenswrapper[4885]: I0308 20:50:27.369696 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:50:27 crc kubenswrapper[4885]: E0308 20:50:27.379067 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:50:42 crc kubenswrapper[4885]: I0308 20:50:42.368293 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:50:42 crc kubenswrapper[4885]: E0308 20:50:42.369432 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.357298 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:50:54 crc kubenswrapper[4885]: E0308 20:50:54.358204 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c364e354-f542-45ec-9322-125db18eb928" containerName="oc" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.358222 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c364e354-f542-45ec-9322-125db18eb928" containerName="oc" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.358368 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c364e354-f542-45ec-9322-125db18eb928" containerName="oc" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.359147 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.363642 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.363897 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jtnpr" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.364168 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.367164 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.367985 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:50:54 crc kubenswrapper[4885]: E0308 20:50:54.368234 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.384703 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.410294 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-d6s92"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.411388 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.414086 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.431100 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-d6s92"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.470632 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-d6s92"] Mar 08 20:50:54 crc kubenswrapper[4885]: E0308 20:50:54.471066 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-qk4ff], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" podUID="d7da12c3-52d7-40e9-b7ce-8140c98dc55d" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.492465 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.493522 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.517354 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.551872 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562764 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv7gt\" (UniqueName: \"kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562817 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562847 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562890 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk4ff\" (UniqueName: \"kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562990 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnx2p\" (UniqueName: \"kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664446 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664466 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664484 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv7gt\" (UniqueName: \"kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664520 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664540 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664569 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk4ff\" (UniqueName: 
\"kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664587 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnx2p\" (UniqueName: \"kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.665357 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.665381 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.665571 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.665578 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: 
\"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.665970 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.683685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnx2p\" (UniqueName: \"kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.703717 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv7gt\" (UniqueName: \"kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.706690 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk4ff\" (UniqueName: \"kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.738412 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.738890 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.758559 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.759718 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.768529 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.768603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z56t\" (UniqueName: \"kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.768630 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.772254 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.869781 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z56t\" (UniqueName: 
\"kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.869834 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.869884 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.870645 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.870934 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.893511 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z56t\" (UniqueName: \"kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: 
\"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.974942 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.054749 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.073621 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.131227 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.178208 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk4ff\" (UniqueName: \"kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff\") pod \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.178261 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc\") pod \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.178349 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config\") pod \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.179039 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config" (OuterVolumeSpecName: "config") pod "d7da12c3-52d7-40e9-b7ce-8140c98dc55d" (UID: "d7da12c3-52d7-40e9-b7ce-8140c98dc55d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.179339 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7da12c3-52d7-40e9-b7ce-8140c98dc55d" (UID: "d7da12c3-52d7-40e9-b7ce-8140c98dc55d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.183096 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff" (OuterVolumeSpecName: "kube-api-access-qk4ff") pod "d7da12c3-52d7-40e9-b7ce-8140c98dc55d" (UID: "d7da12c3-52d7-40e9-b7ce-8140c98dc55d"). InnerVolumeSpecName "kube-api-access-qk4ff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.280694 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.280725 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk4ff\" (UniqueName: \"kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff\") on node \"crc\" DevicePath \"\"" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.280736 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.301508 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"] Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.457101 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:50:55 crc kubenswrapper[4885]: W0308 20:50:55.459131 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cc958bb_3dbf_45b1_be56_8fc362a957a5.slice/crio-b821eaa20bfc8d91b91d7373dbd2e6c29ae06a68f990eb0c0e8cab6e3c9eeacf WatchSource:0}: Error finding container b821eaa20bfc8d91b91d7373dbd2e6c29ae06a68f990eb0c0e8cab6e3c9eeacf: Status 404 returned error can't find the container with id b821eaa20bfc8d91b91d7373dbd2e6c29ae06a68f990eb0c0e8cab6e3c9eeacf Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.605627 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.607224 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.609857 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.610637 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.611014 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.612299 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.612755 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lp8r9" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.618076 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.661976 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:50:55 crc kubenswrapper[4885]: W0308 20:50:55.672364 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7084d0a7_6526_4b90_93a3_e16b9d374be2.slice/crio-cefbe5124552682fd6a4537cd48b7c8e69da3ba23b252f870be87ed5d1dd02f5 WatchSource:0}: Error finding container cefbe5124552682fd6a4537cd48b7c8e69da3ba23b252f870be87ed5d1dd02f5: Status 404 returned error can't find the container with id cefbe5124552682fd6a4537cd48b7c8e69da3ba23b252f870be87ed5d1dd02f5 Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785052 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785100 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785129 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785159 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785367 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785440 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785554 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785591 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785611 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2lfc\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.858838 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.859940 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.862907 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.862944 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.863045 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.863134 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.863986 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qk9sg" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.881875 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886648 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886687 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886741 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886768 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886794 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2lfc\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886837 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886881 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886945 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.887744 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.887992 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.888183 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.888500 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.905297 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.905510 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.905973 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.910325 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.910363 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c3a41ce06382fbd79ab5375af77b75afb5583c9d1133ff7b974e34c8338b5b2/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.913473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2lfc\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.965420 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987755 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987806 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987834 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987849 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987892 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987912 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987957 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987979 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987997 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4w7l\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.061009 4885 generic.go:334] "Generic (PLEG): container finished" podID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerID="4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd" exitCode=0 Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.061081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" event={"ID":"7084d0a7-6526-4b90-93a3-e16b9d374be2","Type":"ContainerDied","Data":"4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd"} Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.061108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" event={"ID":"7084d0a7-6526-4b90-93a3-e16b9d374be2","Type":"ContainerStarted","Data":"cefbe5124552682fd6a4537cd48b7c8e69da3ba23b252f870be87ed5d1dd02f5"} Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.062505 4885 generic.go:334] "Generic (PLEG): 
container finished" podID="e057e697-8dde-4c0b-80f0-7c3a81c8bca0" containerID="3bd36cc2a7b6689ef1af665b0a95fa5d004c8cabcb49857d2fb7d819c7770e99" exitCode=0 Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.062577 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" event={"ID":"e057e697-8dde-4c0b-80f0-7c3a81c8bca0","Type":"ContainerDied","Data":"3bd36cc2a7b6689ef1af665b0a95fa5d004c8cabcb49857d2fb7d819c7770e99"} Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.062602 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" event={"ID":"e057e697-8dde-4c0b-80f0-7c3a81c8bca0","Type":"ContainerStarted","Data":"a27c7d52fd6267f66531fa416c4e6953885eaf21e7fea63ba5bad10bb9c6a4e5"} Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.067045 4885 generic.go:334] "Generic (PLEG): container finished" podID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerID="03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7" exitCode=0 Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.067129 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.068189 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-cn45b" event={"ID":"2cc958bb-3dbf-45b1-be56-8fc362a957a5","Type":"ContainerDied","Data":"03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7"}
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.068220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-cn45b" event={"ID":"2cc958bb-3dbf-45b1-be56-8fc362a957a5","Type":"ContainerStarted","Data":"b821eaa20bfc8d91b91d7373dbd2e6c29ae06a68f990eb0c0e8cab6e3c9eeacf"}
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089394 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089434 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089478 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089497 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4w7l\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089551 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089575 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089612 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089869 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.090322 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.090709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.091040 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.093528 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.097781 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.097874 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.097940 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4c9d1ef4ae2d748771a4a21e0f3a78df8503e1290141efbfe3cef4df6cc2ca2a/globalmount\"" pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.103525 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.109557 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4w7l\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.129234 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-d6s92"]
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.174601 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-d6s92"]
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.179800 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.227290 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.373445 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.473860 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.507113 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv7gt\" (UniqueName: \"kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt\") pod \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") "
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.507982 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config\") pod \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") "
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.508079 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc\") pod \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") "
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.512595 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt" (OuterVolumeSpecName: "kube-api-access-vv7gt") pod "e057e697-8dde-4c0b-80f0-7c3a81c8bca0" (UID: "e057e697-8dde-4c0b-80f0-7c3a81c8bca0"). InnerVolumeSpecName "kube-api-access-vv7gt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.524261 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e057e697-8dde-4c0b-80f0-7c3a81c8bca0" (UID: "e057e697-8dde-4c0b-80f0-7c3a81c8bca0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.529586 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config" (OuterVolumeSpecName: "config") pod "e057e697-8dde-4c0b-80f0-7c3a81c8bca0" (UID: "e057e697-8dde-4c0b-80f0-7c3a81c8bca0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.610574 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config\") on node \"crc\" DevicePath \"\""
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.610899 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.610962 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv7gt\" (UniqueName: \"kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt\") on node \"crc\" DevicePath \"\""
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.637177 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 08 20:50:56 crc kubenswrapper[4885]: W0308 20:50:56.642651 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a3c4c5d_5d8f_4011_9ab6_94c44dfc2872.slice/crio-e0698a4264545dc8b5ff57f24a36650153833a6d5b158c71c19b171366b0d5c7 WatchSource:0}: Error finding container e0698a4264545dc8b5ff57f24a36650153833a6d5b158c71c19b171366b0d5c7: Status 404 returned error can't find the container with id e0698a4264545dc8b5ff57f24a36650153833a6d5b158c71c19b171366b0d5c7
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.932132 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.016394 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 08 20:50:57 crc kubenswrapper[4885]: E0308 20:50:57.017275 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e057e697-8dde-4c0b-80f0-7c3a81c8bca0" containerName="init"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.017306 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e057e697-8dde-4c0b-80f0-7c3a81c8bca0" containerName="init"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.017692 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e057e697-8dde-4c0b-80f0-7c3a81c8bca0" containerName="init"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.019169 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.023173 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.023721 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4vqmw"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.024455 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.024192 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.031787 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.031841 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.075184 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-cn45b" event={"ID":"2cc958bb-3dbf-45b1-be56-8fc362a957a5","Type":"ContainerStarted","Data":"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1"}
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.076220 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c44667757-cn45b"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.077570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerStarted","Data":"e0698a4264545dc8b5ff57f24a36650153833a6d5b158c71c19b171366b0d5c7"}
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.080063 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" event={"ID":"7084d0a7-6526-4b90-93a3-e16b9d374be2","Type":"ContainerStarted","Data":"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899"}
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.080223 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.081417 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" event={"ID":"e057e697-8dde-4c0b-80f0-7c3a81c8bca0","Type":"ContainerDied","Data":"a27c7d52fd6267f66531fa416c4e6953885eaf21e7fea63ba5bad10bb9c6a4e5"}
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.081464 4885 scope.go:117] "RemoveContainer" containerID="3bd36cc2a7b6689ef1af665b0a95fa5d004c8cabcb49857d2fb7d819c7770e99"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.082392 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.082838 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerStarted","Data":"348bdcbe9fe6c9c9545000fb7ce8f20f391b324558bd8ce452114903f6c00551"}
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.095261 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c44667757-cn45b" podStartSLOduration=3.095245305 podStartE2EDuration="3.095245305s" podCreationTimestamp="2026-03-08 20:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:50:57.087771426 +0000 UTC m=+4758.483825459" watchObservedRunningTime="2026-03-08 20:50:57.095245305 +0000 UTC m=+4758.491299338"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.112628 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" podStartSLOduration=3.112610759 podStartE2EDuration="3.112610759s" podCreationTimestamp="2026-03-08 20:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:50:57.105933231 +0000 UTC m=+4758.501987254" watchObservedRunningTime="2026-03-08 20:50:57.112610759 +0000 UTC m=+4758.508664792"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119609 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-default\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119644 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119685 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-kolla-config\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119727 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119750 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119782 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkvvj\" (UniqueName: \"kubernetes.io/projected/1fcbde6c-f104-4c3b-9937-24728ac572a8-kube-api-access-jkvvj\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119828 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.150673 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"]
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.158567 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"]
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221186 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkvvj\" (UniqueName: \"kubernetes.io/projected/1fcbde6c-f104-4c3b-9937-24728ac572a8-kube-api-access-jkvvj\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221284 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221329 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-default\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221345 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221393 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221424 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-kolla-config\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221445 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221464 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.222815 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.222991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-default\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.223232 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-kolla-config\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.223443 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.226485 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.228903 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.234427 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.234474 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/91cdede2446772955a07df46284552052b9252ce73facbf1f7a38d7c6d0d6763/globalmount\"" pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.239532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkvvj\" (UniqueName: \"kubernetes.io/projected/1fcbde6c-f104-4c3b-9937-24728ac572a8-kube-api-access-jkvvj\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.264824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.359014 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.379286 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7da12c3-52d7-40e9-b7ce-8140c98dc55d" path="/var/lib/kubelet/pods/d7da12c3-52d7-40e9-b7ce-8140c98dc55d/volumes"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.379743 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e057e697-8dde-4c0b-80f0-7c3a81c8bca0" path="/var/lib/kubelet/pods/e057e697-8dde-4c0b-80f0-7c3a81c8bca0/volumes"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.381831 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.382875 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.385489 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.385751 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-qv44v"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.391653 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.526103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-config-data\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.526551 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kolla-config\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.526635 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84zqh\" (UniqueName: \"kubernetes.io/projected/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kube-api-access-84zqh\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.628104 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-config-data\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.628183 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kolla-config\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.628227 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84zqh\" (UniqueName: \"kubernetes.io/projected/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kube-api-access-84zqh\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.629102 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-config-data\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.632496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kolla-config\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.660099 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84zqh\" (UniqueName: \"kubernetes.io/projected/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kube-api-access-84zqh\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.748951 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.870365 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.059158 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.119138 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820","Type":"ContainerStarted","Data":"771327ad1f03a668ab2f0f78172055e3b00ae9001ebc750840fa50e53e070890"}
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.142562 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerStarted","Data":"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea"}
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.144189 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerStarted","Data":"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873"}
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.146143 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1fcbde6c-f104-4c3b-9937-24728ac572a8","Type":"ContainerStarted","Data":"6f7078c8a1a13669613cf0c66f3e7c89c5596c34cb5f4ac3c549d6634c97d9ef"}
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.544402 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.546440 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.549006 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.549614 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.550543 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.551975 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mrvsj"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.569962 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.650528 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6764t\" (UniqueName: \"kubernetes.io/projected/31392f16-4aaa-4512-982e-0c56d9af8200-kube-api-access-6764t\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651190 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651306 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651402 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651490 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651598 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651806 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753353 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0"
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753421 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName:
\"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753480 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6764t\" (UniqueName: \"kubernetes.io/projected/31392f16-4aaa-4512-982e-0c56d9af8200-kube-api-access-6764t\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753534 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753570 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753625 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753680 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.755899 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.757072 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.757195 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.758695 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.761033 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.761154 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.761201 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c4b9c1ff5d2434b7037bc43fb21fb2c8a0d4afe3004f6980f97a489a205f8d37/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.763329 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.779993 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6764t\" (UniqueName: \"kubernetes.io/projected/31392f16-4aaa-4512-982e-0c56d9af8200-kube-api-access-6764t\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.808093 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\") pod \"openstack-cell1-galera-0\" 
(UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.910338 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:59 crc kubenswrapper[4885]: I0308 20:50:59.158761 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820","Type":"ContainerStarted","Data":"1093ea04c47ee91a81dbc795ff87234498e2aa142aba1fb2808cf80f06a787cc"} Mar 08 20:50:59 crc kubenswrapper[4885]: I0308 20:50:59.158881 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 08 20:50:59 crc kubenswrapper[4885]: I0308 20:50:59.162517 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1fcbde6c-f104-4c3b-9937-24728ac572a8","Type":"ContainerStarted","Data":"1699306d6aff7e9d600087067f63f83aca2a81cb9ff13ee55984e4de5f1d006d"} Mar 08 20:50:59 crc kubenswrapper[4885]: I0308 20:50:59.201617 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.201593352 podStartE2EDuration="2.201593352s" podCreationTimestamp="2026-03-08 20:50:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:50:59.195421946 +0000 UTC m=+4760.591476039" watchObservedRunningTime="2026-03-08 20:50:59.201593352 +0000 UTC m=+4760.597647375" Mar 08 20:50:59 crc kubenswrapper[4885]: I0308 20:50:59.209749 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 20:50:59 crc kubenswrapper[4885]: W0308 20:50:59.211433 4885 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31392f16_4aaa_4512_982e_0c56d9af8200.slice/crio-1b6a951b8ae66a7c98c9902d88b4d2af73a7740f745e62a65b5926f3df719ba1 WatchSource:0}: Error finding container 1b6a951b8ae66a7c98c9902d88b4d2af73a7740f745e62a65b5926f3df719ba1: Status 404 returned error can't find the container with id 1b6a951b8ae66a7c98c9902d88b4d2af73a7740f745e62a65b5926f3df719ba1 Mar 08 20:51:00 crc kubenswrapper[4885]: I0308 20:51:00.170023 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31392f16-4aaa-4512-982e-0c56d9af8200","Type":"ContainerStarted","Data":"3c59b827ce865260b9caea2ee52a7477867f098fcd784611a33e76a585e139ae"} Mar 08 20:51:00 crc kubenswrapper[4885]: I0308 20:51:00.171498 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31392f16-4aaa-4512-982e-0c56d9af8200","Type":"ContainerStarted","Data":"1b6a951b8ae66a7c98c9902d88b4d2af73a7740f745e62a65b5926f3df719ba1"} Mar 08 20:51:02 crc kubenswrapper[4885]: I0308 20:51:02.195547 4885 generic.go:334] "Generic (PLEG): container finished" podID="1fcbde6c-f104-4c3b-9937-24728ac572a8" containerID="1699306d6aff7e9d600087067f63f83aca2a81cb9ff13ee55984e4de5f1d006d" exitCode=0 Mar 08 20:51:02 crc kubenswrapper[4885]: I0308 20:51:02.195676 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1fcbde6c-f104-4c3b-9937-24728ac572a8","Type":"ContainerDied","Data":"1699306d6aff7e9d600087067f63f83aca2a81cb9ff13ee55984e4de5f1d006d"} Mar 08 20:51:03 crc kubenswrapper[4885]: I0308 20:51:03.205318 4885 generic.go:334] "Generic (PLEG): container finished" podID="31392f16-4aaa-4512-982e-0c56d9af8200" containerID="3c59b827ce865260b9caea2ee52a7477867f098fcd784611a33e76a585e139ae" exitCode=0 Mar 08 20:51:03 crc kubenswrapper[4885]: I0308 20:51:03.205427 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"31392f16-4aaa-4512-982e-0c56d9af8200","Type":"ContainerDied","Data":"3c59b827ce865260b9caea2ee52a7477867f098fcd784611a33e76a585e139ae"} Mar 08 20:51:03 crc kubenswrapper[4885]: I0308 20:51:03.209244 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1fcbde6c-f104-4c3b-9937-24728ac572a8","Type":"ContainerStarted","Data":"a63c0dad7bb6aa0ed2276ddf85ccb822aed5a56c6e68d2083912752d804a9ed3"} Mar 08 20:51:03 crc kubenswrapper[4885]: I0308 20:51:03.261464 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.261445067 podStartE2EDuration="8.261445067s" podCreationTimestamp="2026-03-08 20:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:51:03.257193543 +0000 UTC m=+4764.653247596" watchObservedRunningTime="2026-03-08 20:51:03.261445067 +0000 UTC m=+4764.657499100" Mar 08 20:51:03 crc kubenswrapper[4885]: I0308 20:51:03.597067 4885 scope.go:117] "RemoveContainer" containerID="fc9b989616ec5b500765230953c308a829faaac795ceed2b8cacd49b9b6ec121" Mar 08 20:51:04 crc kubenswrapper[4885]: I0308 20:51:04.222259 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31392f16-4aaa-4512-982e-0c56d9af8200","Type":"ContainerStarted","Data":"e9c3869c867e654ad6db645ee2d2b003762a2f82a7f211fa0201f8897d528392"} Mar 08 20:51:04 crc kubenswrapper[4885]: I0308 20:51:04.256287 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.256269864 podStartE2EDuration="7.256269864s" podCreationTimestamp="2026-03-08 20:50:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:51:04.249077843 
+0000 UTC m=+4765.645131866" watchObservedRunningTime="2026-03-08 20:51:04.256269864 +0000 UTC m=+4765.652323887" Mar 08 20:51:04 crc kubenswrapper[4885]: I0308 20:51:04.977405 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.134394 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.204278 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.235068 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c44667757-cn45b" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="dnsmasq-dns" containerID="cri-o://36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1" gracePeriod=10 Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.644373 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.770966 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnx2p\" (UniqueName: \"kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p\") pod \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.771161 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config\") pod \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.778131 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p" (OuterVolumeSpecName: "kube-api-access-qnx2p") pod "2cc958bb-3dbf-45b1-be56-8fc362a957a5" (UID: "2cc958bb-3dbf-45b1-be56-8fc362a957a5"). InnerVolumeSpecName "kube-api-access-qnx2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.812270 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config" (OuterVolumeSpecName: "config") pod "2cc958bb-3dbf-45b1-be56-8fc362a957a5" (UID: "2cc958bb-3dbf-45b1-be56-8fc362a957a5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.873112 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnx2p\" (UniqueName: \"kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p\") on node \"crc\" DevicePath \"\"" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.873151 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.248711 4885 generic.go:334] "Generic (PLEG): container finished" podID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerID="36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1" exitCode=0 Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.248823 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.248815 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-cn45b" event={"ID":"2cc958bb-3dbf-45b1-be56-8fc362a957a5","Type":"ContainerDied","Data":"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1"} Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.250539 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-cn45b" event={"ID":"2cc958bb-3dbf-45b1-be56-8fc362a957a5","Type":"ContainerDied","Data":"b821eaa20bfc8d91b91d7373dbd2e6c29ae06a68f990eb0c0e8cab6e3c9eeacf"} Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.250581 4885 scope.go:117] "RemoveContainer" containerID="36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.284187 4885 scope.go:117] "RemoveContainer" 
containerID="03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.321413 4885 scope.go:117] "RemoveContainer" containerID="36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1" Mar 08 20:51:06 crc kubenswrapper[4885]: E0308 20:51:06.322084 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1\": container with ID starting with 36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1 not found: ID does not exist" containerID="36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.322139 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1"} err="failed to get container status \"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1\": rpc error: code = NotFound desc = could not find container \"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1\": container with ID starting with 36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1 not found: ID does not exist" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.322176 4885 scope.go:117] "RemoveContainer" containerID="03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7" Mar 08 20:51:06 crc kubenswrapper[4885]: E0308 20:51:06.322662 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7\": container with ID starting with 03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7 not found: ID does not exist" containerID="03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7" Mar 08 20:51:06 crc 
kubenswrapper[4885]: I0308 20:51:06.322727 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7"} err="failed to get container status \"03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7\": rpc error: code = NotFound desc = could not find container \"03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7\": container with ID starting with 03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7 not found: ID does not exist" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.325662 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.334201 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.360049 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.360336 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.368731 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:51:07 crc kubenswrapper[4885]: E0308 20:51:07.369019 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.380842 4885 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" path="/var/lib/kubelet/pods/2cc958bb-3dbf-45b1-be56-8fc362a957a5/volumes" Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.452209 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.749750 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 08 20:51:08 crc kubenswrapper[4885]: I0308 20:51:08.492636 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 08 20:51:08 crc kubenswrapper[4885]: I0308 20:51:08.912391 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 08 20:51:08 crc kubenswrapper[4885]: I0308 20:51:08.912442 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 08 20:51:11 crc kubenswrapper[4885]: I0308 20:51:11.393695 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 08 20:51:11 crc kubenswrapper[4885]: I0308 20:51:11.521184 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.039026 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-p2l7p"] Mar 08 20:51:16 crc kubenswrapper[4885]: E0308 20:51:16.040613 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="dnsmasq-dns" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.040644 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="dnsmasq-dns" Mar 08 20:51:16 crc kubenswrapper[4885]: 
E0308 20:51:16.040804 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="init" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.040817 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="init" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.041378 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="dnsmasq-dns" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.042319 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.051165 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.057360 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p2l7p"] Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.157962 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.158099 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gkv6\" (UniqueName: \"kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.259698 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gkv6\" (UniqueName: \"kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.259895 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.261304 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.300858 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gkv6\" (UniqueName: \"kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.390739 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-p2l7p"
Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.805966 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p2l7p"]
Mar 08 20:51:17 crc kubenswrapper[4885]: I0308 20:51:17.345533 4885 generic.go:334] "Generic (PLEG): container finished" podID="7bf35f28-81c5-4f7c-9048-212d8899b7e7" containerID="320469e7a132de2b538abe239f05e1a393daef2ccce9780de0687281e071e2ef" exitCode=0
Mar 08 20:51:17 crc kubenswrapper[4885]: I0308 20:51:17.345583 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p2l7p" event={"ID":"7bf35f28-81c5-4f7c-9048-212d8899b7e7","Type":"ContainerDied","Data":"320469e7a132de2b538abe239f05e1a393daef2ccce9780de0687281e071e2ef"}
Mar 08 20:51:17 crc kubenswrapper[4885]: I0308 20:51:17.345613 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p2l7p" event={"ID":"7bf35f28-81c5-4f7c-9048-212d8899b7e7","Type":"ContainerStarted","Data":"2d52f4284b74e3b521ad5c49794d9a3b1363e683e4f40a3c0854ee8735cca9ee"}
Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.750915 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p2l7p"
Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.802268 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gkv6\" (UniqueName: \"kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6\") pod \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") "
Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.802388 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts\") pod \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") "
Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.803376 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bf35f28-81c5-4f7c-9048-212d8899b7e7" (UID: "7bf35f28-81c5-4f7c-9048-212d8899b7e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.812890 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6" (OuterVolumeSpecName: "kube-api-access-2gkv6") pod "7bf35f28-81c5-4f7c-9048-212d8899b7e7" (UID: "7bf35f28-81c5-4f7c-9048-212d8899b7e7"). InnerVolumeSpecName "kube-api-access-2gkv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.904067 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gkv6\" (UniqueName: \"kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6\") on node \"crc\" DevicePath \"\""
Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.904119 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 20:51:19 crc kubenswrapper[4885]: I0308 20:51:19.367008 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p2l7p"
Mar 08 20:51:19 crc kubenswrapper[4885]: I0308 20:51:19.375321 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"
Mar 08 20:51:19 crc kubenswrapper[4885]: E0308 20:51:19.375742 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:51:19 crc kubenswrapper[4885]: I0308 20:51:19.384204 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p2l7p" event={"ID":"7bf35f28-81c5-4f7c-9048-212d8899b7e7","Type":"ContainerDied","Data":"2d52f4284b74e3b521ad5c49794d9a3b1363e683e4f40a3c0854ee8735cca9ee"}
Mar 08 20:51:19 crc kubenswrapper[4885]: I0308 20:51:19.384254 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d52f4284b74e3b521ad5c49794d9a3b1363e683e4f40a3c0854ee8735cca9ee"
Mar 08 20:51:22 crc kubenswrapper[4885]: I0308 20:51:22.559390 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-p2l7p"]
Mar 08 20:51:22 crc kubenswrapper[4885]: I0308 20:51:22.570025 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-p2l7p"]
Mar 08 20:51:23 crc kubenswrapper[4885]: I0308 20:51:23.383226 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bf35f28-81c5-4f7c-9048-212d8899b7e7" path="/var/lib/kubelet/pods/7bf35f28-81c5-4f7c-9048-212d8899b7e7/volumes"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.564434 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-k2nvt"]
Mar 08 20:51:27 crc kubenswrapper[4885]: E0308 20:51:27.565152 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf35f28-81c5-4f7c-9048-212d8899b7e7" containerName="mariadb-account-create-update"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.565169 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf35f28-81c5-4f7c-9048-212d8899b7e7" containerName="mariadb-account-create-update"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.565350 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf35f28-81c5-4f7c-9048-212d8899b7e7" containerName="mariadb-account-create-update"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.565984 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k2nvt"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.568320 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.590646 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k2nvt"]
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.740486 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.740727 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-542x2\" (UniqueName: \"kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.842555 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.842708 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-542x2\" (UniqueName: \"kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.844113 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.878629 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-542x2\" (UniqueName: \"kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt"
Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.897499 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k2nvt"
Mar 08 20:51:28 crc kubenswrapper[4885]: I0308 20:51:28.451366 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k2nvt"]
Mar 08 20:51:29 crc kubenswrapper[4885]: I0308 20:51:29.474416 4885 generic.go:334] "Generic (PLEG): container finished" podID="1dd375ae-21ae-4fb9-87dc-f6a1a205736f" containerID="d3ae625b11e0cf7052090345483fbacc9c9a2ab6adc1e7e832c166efaabc3867" exitCode=0
Mar 08 20:51:29 crc kubenswrapper[4885]: I0308 20:51:29.474501 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k2nvt" event={"ID":"1dd375ae-21ae-4fb9-87dc-f6a1a205736f","Type":"ContainerDied","Data":"d3ae625b11e0cf7052090345483fbacc9c9a2ab6adc1e7e832c166efaabc3867"}
Mar 08 20:51:29 crc kubenswrapper[4885]: I0308 20:51:29.474829 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k2nvt" event={"ID":"1dd375ae-21ae-4fb9-87dc-f6a1a205736f","Type":"ContainerStarted","Data":"fb3e59b65f875de48f7b6500d6e768415adf73a1239015878abe1e2c78a32210"}
Mar 08 20:51:30 crc kubenswrapper[4885]: E0308 20:51:30.022471 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17485fbe_1c6a_4d1b_91b3_c465215cb4be.slice/crio-conmon-665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17485fbe_1c6a_4d1b_91b3_c465215cb4be.slice/crio-665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea.scope\": RecentStats: unable to find data in memory cache]"
Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.368485 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"
Mar 08 20:51:30 crc kubenswrapper[4885]: E0308 20:51:30.368852 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.489707 4885 generic.go:334] "Generic (PLEG): container finished" podID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerID="665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea" exitCode=0
Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.489798 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerDied","Data":"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea"}
Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.492504 4885 generic.go:334] "Generic (PLEG): container finished" podID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerID="7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873" exitCode=0
Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.492619 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerDied","Data":"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873"}
Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.857708 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k2nvt"
Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.998546 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts\") pod \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") "
Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.998613 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-542x2\" (UniqueName: \"kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2\") pod \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") "
Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.999144 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1dd375ae-21ae-4fb9-87dc-f6a1a205736f" (UID: "1dd375ae-21ae-4fb9-87dc-f6a1a205736f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.002722 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2" (OuterVolumeSpecName: "kube-api-access-542x2") pod "1dd375ae-21ae-4fb9-87dc-f6a1a205736f" (UID: "1dd375ae-21ae-4fb9-87dc-f6a1a205736f"). InnerVolumeSpecName "kube-api-access-542x2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.099835 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.099865 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-542x2\" (UniqueName: \"kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2\") on node \"crc\" DevicePath \"\""
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.504125 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k2nvt" event={"ID":"1dd375ae-21ae-4fb9-87dc-f6a1a205736f","Type":"ContainerDied","Data":"fb3e59b65f875de48f7b6500d6e768415adf73a1239015878abe1e2c78a32210"}
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.504169 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k2nvt"
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.504187 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb3e59b65f875de48f7b6500d6e768415adf73a1239015878abe1e2c78a32210"
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.507576 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerStarted","Data":"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c"}
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.508722 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.513395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerStarted","Data":"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5"}
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.513985 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.538566 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.538545444 podStartE2EDuration="37.538545444s" podCreationTimestamp="2026-03-08 20:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:51:31.53350196 +0000 UTC m=+4792.929555993" watchObservedRunningTime="2026-03-08 20:51:31.538545444 +0000 UTC m=+4792.934599467"
Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.560768 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.560750867 podStartE2EDuration="37.560750867s" podCreationTimestamp="2026-03-08 20:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:51:31.554413397 +0000 UTC m=+4792.950467460" watchObservedRunningTime="2026-03-08 20:51:31.560750867 +0000 UTC m=+4792.956804890"
Mar 08 20:51:42 crc kubenswrapper[4885]: I0308 20:51:42.368581 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"
Mar 08 20:51:42 crc kubenswrapper[4885]: E0308 20:51:42.369498 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:51:46 crc kubenswrapper[4885]: I0308 20:51:46.232205 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 08 20:51:46 crc kubenswrapper[4885]: I0308 20:51:46.478269 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.358993 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"]
Mar 08 20:51:51 crc kubenswrapper[4885]: E0308 20:51:51.359735 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd375ae-21ae-4fb9-87dc-f6a1a205736f" containerName="mariadb-account-create-update"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.359752 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd375ae-21ae-4fb9-87dc-f6a1a205736f" containerName="mariadb-account-create-update"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.359902 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd375ae-21ae-4fb9-87dc-f6a1a205736f" containerName="mariadb-account-create-update"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.360600 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.382252 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"]
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.488121 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vx9b\" (UniqueName: \"kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.488210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.488716 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.590187 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.591135 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.591393 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.592125 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.592198 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vx9b\" (UniqueName: \"kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.631824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vx9b\" (UniqueName: \"kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.680740 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.144984 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.175443 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"]
Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.705345 4885 generic.go:334] "Generic (PLEG): container finished" podID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerID="42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43" exitCode=0
Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.705433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" event={"ID":"8e1559f2-4966-4752-8c07-aea40781bbd3","Type":"ContainerDied","Data":"42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43"}
Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.705625 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" event={"ID":"8e1559f2-4966-4752-8c07-aea40781bbd3","Type":"ContainerStarted","Data":"e38cb66edab29ad6cc751ddfec546724fea6786800c820b585d732bbfd60f672"}
Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.917692 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 08 20:51:53 crc kubenswrapper[4885]: I0308 20:51:53.717729 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" event={"ID":"8e1559f2-4966-4752-8c07-aea40781bbd3","Type":"ContainerStarted","Data":"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5"}
Mar 08 20:51:53 crc kubenswrapper[4885]: I0308 20:51:53.718858 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz"
Mar 08 20:51:53 crc kubenswrapper[4885]: I0308 20:51:53.747785 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" podStartSLOduration=2.747769773 podStartE2EDuration="2.747769773s" podCreationTimestamp="2026-03-08 20:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:51:53.743671483 +0000 UTC m=+4815.139725506" watchObservedRunningTime="2026-03-08 20:51:53.747769773 +0000 UTC m=+4815.143823796"
Mar 08 20:51:53 crc kubenswrapper[4885]: I0308 20:51:53.864361 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="rabbitmq" containerID="cri-o://075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c" gracePeriod=604799
Mar 08 20:51:54 crc kubenswrapper[4885]: I0308 20:51:54.755427 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="rabbitmq" containerID="cri-o://70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5" gracePeriod=604799
Mar 08 20:51:56 crc kubenswrapper[4885]: I0308 20:51:56.228581 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.22:5672: connect: connection refused"
Mar 08 20:51:56 crc kubenswrapper[4885]: I0308 20:51:56.475213 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.23:5672: connect: connection refused"
Mar 08 20:51:57 crc kubenswrapper[4885]: I0308 20:51:57.370532 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"
Mar 08 20:51:57 crc kubenswrapper[4885]: E0308 20:51:57.371513 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.143618 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550052-jkg7s"]
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.146551 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550052-jkg7s"
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.156798 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550052-jkg7s"]
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.158472 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.158849 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.159157 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.252732 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ljxm\" (UniqueName: \"kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm\") pod \"auto-csr-approver-29550052-jkg7s\" (UID: \"2414c1e7-ce59-4c76-865d-1a5ffa71578f\") " pod="openshift-infra/auto-csr-approver-29550052-jkg7s"
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.354192 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ljxm\" (UniqueName: \"kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm\") pod \"auto-csr-approver-29550052-jkg7s\" (UID: \"2414c1e7-ce59-4c76-865d-1a5ffa71578f\") " pod="openshift-infra/auto-csr-approver-29550052-jkg7s"
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.387149 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ljxm\" (UniqueName: \"kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm\") pod \"auto-csr-approver-29550052-jkg7s\" (UID: \"2414c1e7-ce59-4c76-865d-1a5ffa71578f\") " pod="openshift-infra/auto-csr-approver-29550052-jkg7s"
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.474434 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.485632 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550052-jkg7s"
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.557798 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") "
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.557850 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") "
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.557901 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") "
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.557970 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") "
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.557995 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") "
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.558027 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") "
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.558057 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") "
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.558171 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") "
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.558197 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4w7l\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") "
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.563397 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.573683 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info" (OuterVolumeSpecName: "pod-info") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.573784 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.574057 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.574232 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.584172 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l" (OuterVolumeSpecName: "kube-api-access-f4w7l") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "kube-api-access-f4w7l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.602513 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf" (OuterVolumeSpecName: "server-conf") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.610352 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa" (OuterVolumeSpecName: "persistence") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.659948 4885 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.659986 4885 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf\") on node \"crc\" DevicePath \"\""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.659998 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.660010 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.660023 4885 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.660058 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") on node \"crc\" "
Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.660074 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4w7l\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l\") on node \"crc\" 
DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.660086 4885 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.675461 4885 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.675691 4885 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa") on node "crc" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.691185 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.760851 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.761309 4885 reconciler_common.go:293] "Volume detached for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.779581 4885 generic.go:334] "Generic (PLEG): container finished" podID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerID="075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c" exitCode=0 Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.779626 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerDied","Data":"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c"} Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.779659 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerDied","Data":"348bdcbe9fe6c9c9545000fb7ce8f20f391b324558bd8ce452114903f6c00551"} Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.779678 4885 scope.go:117] "RemoveContainer" containerID="075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.779817 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.822065 4885 scope.go:117] "RemoveContainer" containerID="665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.823053 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.828350 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.850179 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:52:00 crc kubenswrapper[4885]: E0308 20:52:00.850543 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="rabbitmq" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.850562 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="rabbitmq" Mar 08 20:52:00 crc kubenswrapper[4885]: E0308 20:52:00.850587 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="setup-container" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.850594 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="setup-container" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.850756 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="rabbitmq" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.851546 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.859695 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550052-jkg7s"] Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.859882 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.859889 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.860165 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.860309 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.862275 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qk9sg" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.867784 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.869617 4885 scope.go:117] "RemoveContainer" containerID="075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c" Mar 08 20:52:00 crc kubenswrapper[4885]: E0308 20:52:00.870247 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c\": container with ID starting with 075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c not found: ID does not exist" containerID="075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.870273 4885 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c"} err="failed to get container status \"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c\": rpc error: code = NotFound desc = could not find container \"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c\": container with ID starting with 075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c not found: ID does not exist" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.870292 4885 scope.go:117] "RemoveContainer" containerID="665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea" Mar 08 20:52:00 crc kubenswrapper[4885]: E0308 20:52:00.870559 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea\": container with ID starting with 665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea not found: ID does not exist" containerID="665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.870580 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea"} err="failed to get container status \"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea\": rpc error: code = NotFound desc = could not find container \"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea\": container with ID starting with 665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea not found: ID does not exist" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.879219 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970177 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00704ff6-696f-4687-99e0-23bf055d1bef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970322 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970364 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970396 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00704ff6-696f-4687-99e0-23bf055d1bef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970424 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nl8j\" (UniqueName: \"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-kube-api-access-7nl8j\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970469 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970493 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970535 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970556 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072090 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072146 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/00704ff6-696f-4687-99e0-23bf055d1bef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072171 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nl8j\" (UniqueName: \"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-kube-api-access-7nl8j\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072205 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072223 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072253 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072269 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072299 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00704ff6-696f-4687-99e0-23bf055d1bef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072368 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072890 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.073147 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.078493 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00704ff6-696f-4687-99e0-23bf055d1bef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.079087 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.079630 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.081537 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.082300 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.082350 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4c9d1ef4ae2d748771a4a21e0f3a78df8503e1290141efbfe3cef4df6cc2ca2a/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.091903 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00704ff6-696f-4687-99e0-23bf055d1bef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.106346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nl8j\" (UniqueName: \"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-kube-api-access-7nl8j\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.126232 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.180199 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.317111 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.380107 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" path="/var/lib/kubelet/pods/17485fbe-1c6a-4d1b-91b3-c465215cb4be/volumes" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479437 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479860 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479882 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479938 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479956 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479990 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.480010 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2lfc\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479978 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.480051 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.480181 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.480485 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.480791 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.481464 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.486374 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.486584 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info" (OuterVolumeSpecName: "pod-info") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.489192 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc" (OuterVolumeSpecName: "kube-api-access-l2lfc") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "kube-api-access-l2lfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.493572 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c" (OuterVolumeSpecName: "persistence") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.507506 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf" (OuterVolumeSpecName: "server-conf") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.563643 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.581742 4885 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582015 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582109 4885 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582185 4885 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf\") on node \"crc\" DevicePath \"\"" Mar 08 
20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582486 4885 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582578 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2lfc\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582669 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582832 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") on node \"crc\" " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.599540 4885 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.599739 4885 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c") on node "crc" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.667599 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:52:01 crc kubenswrapper[4885]: W0308 20:52:01.678363 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00704ff6_696f_4687_99e0_23bf055d1bef.slice/crio-d323dda659275d06098e9af019227635ed12b00854dc16072ef396266c35f2f5 WatchSource:0}: Error finding container d323dda659275d06098e9af019227635ed12b00854dc16072ef396266c35f2f5: Status 404 returned error can't find the container with id d323dda659275d06098e9af019227635ed12b00854dc16072ef396266c35f2f5 Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.682095 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.683895 4885 reconciler_common.go:293] "Volume detached for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.756336 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.756627 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="dnsmasq-dns" containerID="cri-o://7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899" 
gracePeriod=10 Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.803332 4885 generic.go:334] "Generic (PLEG): container finished" podID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerID="70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5" exitCode=0 Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.803529 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerDied","Data":"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5"} Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.803570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerDied","Data":"e0698a4264545dc8b5ff57f24a36650153833a6d5b158c71c19b171366b0d5c7"} Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.803592 4885 scope.go:117] "RemoveContainer" containerID="70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.803718 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.819382 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00704ff6-696f-4687-99e0-23bf055d1bef","Type":"ContainerStarted","Data":"d323dda659275d06098e9af019227635ed12b00854dc16072ef396266c35f2f5"} Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.821015 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" event={"ID":"2414c1e7-ce59-4c76-865d-1a5ffa71578f","Type":"ContainerStarted","Data":"8e59bd71d608d2a19f6aba445bfc270fde040b3beeb2fa5a0759e05d0d91b133"} Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.894180 4885 scope.go:117] "RemoveContainer" containerID="7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.918341 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.952713 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.987078 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:52:01 crc kubenswrapper[4885]: E0308 20:52:01.987763 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="setup-container" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.987780 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="setup-container" Mar 08 20:52:01 crc kubenswrapper[4885]: E0308 20:52:01.987796 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="rabbitmq" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 
20:52:01.987804 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="rabbitmq" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.988113 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="rabbitmq" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.990399 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.995589 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.995968 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.995768 4885 scope.go:117] "RemoveContainer" containerID="70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5" Mar 08 20:52:01 crc kubenswrapper[4885]: E0308 20:52:01.996577 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5\": container with ID starting with 70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5 not found: ID does not exist" containerID="70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.996613 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5"} err="failed to get container status \"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5\": rpc error: code = NotFound desc = could not find container \"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5\": container with ID starting 
with 70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5 not found: ID does not exist" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.996640 4885 scope.go:117] "RemoveContainer" containerID="7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873" Mar 08 20:52:01 crc kubenswrapper[4885]: E0308 20:52:01.996875 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873\": container with ID starting with 7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873 not found: ID does not exist" containerID="7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.996897 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873"} err="failed to get container status \"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873\": rpc error: code = NotFound desc = could not find container \"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873\": container with ID starting with 7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873 not found: ID does not exist" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.997407 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.997783 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.999598 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lp8r9" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.001253 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099054 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4cx\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-kube-api-access-td4cx\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099136 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099318 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099345 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099364 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0d39294-b81d-4534-b86a-35a3aea74ed7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099383 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0d39294-b81d-4534-b86a-35a3aea74ed7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200471 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200496 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0d39294-b81d-4534-b86a-35a3aea74ed7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200519 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200539 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200563 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0d39294-b81d-4534-b86a-35a3aea74ed7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 
20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200601 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td4cx\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-kube-api-access-td4cx\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200641 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.201539 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.202224 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 
20:52:02.202720 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.202764 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.207441 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0d39294-b81d-4534-b86a-35a3aea74ed7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.207687 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.207822 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.207871 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c3a41ce06382fbd79ab5375af77b75afb5583c9d1133ff7b974e34c8338b5b2/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.219048 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td4cx\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-kube-api-access-td4cx\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.219839 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0d39294-b81d-4534-b86a-35a3aea74ed7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.247532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.333270 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.597658 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:52:02 crc kubenswrapper[4885]: W0308 20:52:02.650092 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0d39294_b81d_4534_b86a_35a3aea74ed7.slice/crio-41f7dae3a127db06853738981eb55d48d6847c008aad7cda6045a1a3d97286d8 WatchSource:0}: Error finding container 41f7dae3a127db06853738981eb55d48d6847c008aad7cda6045a1a3d97286d8: Status 404 returned error can't find the container with id 41f7dae3a127db06853738981eb55d48d6847c008aad7cda6045a1a3d97286d8 Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.653556 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.708570 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z56t\" (UniqueName: \"kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t\") pod \"7084d0a7-6526-4b90-93a3-e16b9d374be2\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.708787 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config\") pod \"7084d0a7-6526-4b90-93a3-e16b9d374be2\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.708896 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc\") pod \"7084d0a7-6526-4b90-93a3-e16b9d374be2\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " 
Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.713212 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t" (OuterVolumeSpecName: "kube-api-access-7z56t") pod "7084d0a7-6526-4b90-93a3-e16b9d374be2" (UID: "7084d0a7-6526-4b90-93a3-e16b9d374be2"). InnerVolumeSpecName "kube-api-access-7z56t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.747799 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7084d0a7-6526-4b90-93a3-e16b9d374be2" (UID: "7084d0a7-6526-4b90-93a3-e16b9d374be2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.748970 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config" (OuterVolumeSpecName: "config") pod "7084d0a7-6526-4b90-93a3-e16b9d374be2" (UID: "7084d0a7-6526-4b90-93a3-e16b9d374be2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.810466 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.810506 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.810524 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z56t\" (UniqueName: \"kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.829803 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f0d39294-b81d-4534-b86a-35a3aea74ed7","Type":"ContainerStarted","Data":"41f7dae3a127db06853738981eb55d48d6847c008aad7cda6045a1a3d97286d8"} Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.831705 4885 generic.go:334] "Generic (PLEG): container finished" podID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerID="7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899" exitCode=0 Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.831793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" event={"ID":"7084d0a7-6526-4b90-93a3-e16b9d374be2","Type":"ContainerDied","Data":"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899"} Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.831828 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" 
event={"ID":"7084d0a7-6526-4b90-93a3-e16b9d374be2","Type":"ContainerDied","Data":"cefbe5124552682fd6a4537cd48b7c8e69da3ba23b252f870be87ed5d1dd02f5"} Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.831848 4885 scope.go:117] "RemoveContainer" containerID="7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.832175 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.837478 4885 generic.go:334] "Generic (PLEG): container finished" podID="2414c1e7-ce59-4c76-865d-1a5ffa71578f" containerID="4a9dac92cb97fc09835d72492b83bdbd16e3d2d9b07c98a3d36966204fa55732" exitCode=0 Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.837872 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" event={"ID":"2414c1e7-ce59-4c76-865d-1a5ffa71578f","Type":"ContainerDied","Data":"4a9dac92cb97fc09835d72492b83bdbd16e3d2d9b07c98a3d36966204fa55732"} Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.858639 4885 scope.go:117] "RemoveContainer" containerID="4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.888837 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.898165 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.902053 4885 scope.go:117] "RemoveContainer" containerID="7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899" Mar 08 20:52:02 crc kubenswrapper[4885]: E0308 20:52:02.902558 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899\": container with ID starting with 7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899 not found: ID does not exist" containerID="7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.902598 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899"} err="failed to get container status \"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899\": rpc error: code = NotFound desc = could not find container \"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899\": container with ID starting with 7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899 not found: ID does not exist" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.902624 4885 scope.go:117] "RemoveContainer" containerID="4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd" Mar 08 20:52:02 crc kubenswrapper[4885]: E0308 20:52:02.902980 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd\": container with ID starting with 4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd not found: ID does not exist" containerID="4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.903045 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd"} err="failed to get container status \"4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd\": rpc error: code = NotFound desc = could not find container \"4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd\": container with ID 
starting with 4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd not found: ID does not exist" Mar 08 20:52:03 crc kubenswrapper[4885]: I0308 20:52:03.383777 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" path="/var/lib/kubelet/pods/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872/volumes" Mar 08 20:52:03 crc kubenswrapper[4885]: I0308 20:52:03.385568 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" path="/var/lib/kubelet/pods/7084d0a7-6526-4b90-93a3-e16b9d374be2/volumes" Mar 08 20:52:03 crc kubenswrapper[4885]: I0308 20:52:03.848025 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00704ff6-696f-4687-99e0-23bf055d1bef","Type":"ContainerStarted","Data":"a283cbecf908e182dd51b0bfeb925825b7ba37fe1ff0de14d5fc2f94c054e7eb"} Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.391530 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.441716 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ljxm\" (UniqueName: \"kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm\") pod \"2414c1e7-ce59-4c76-865d-1a5ffa71578f\" (UID: \"2414c1e7-ce59-4c76-865d-1a5ffa71578f\") " Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.448899 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm" (OuterVolumeSpecName: "kube-api-access-2ljxm") pod "2414c1e7-ce59-4c76-865d-1a5ffa71578f" (UID: "2414c1e7-ce59-4c76-865d-1a5ffa71578f"). InnerVolumeSpecName "kube-api-access-2ljxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.543489 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ljxm\" (UniqueName: \"kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.860541 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" event={"ID":"2414c1e7-ce59-4c76-865d-1a5ffa71578f","Type":"ContainerDied","Data":"8e59bd71d608d2a19f6aba445bfc270fde040b3beeb2fa5a0759e05d0d91b133"} Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.860566 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.860590 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e59bd71d608d2a19f6aba445bfc270fde040b3beeb2fa5a0759e05d0d91b133" Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.863918 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f0d39294-b81d-4534-b86a-35a3aea74ed7","Type":"ContainerStarted","Data":"3da51067610480309689286dd1478ea79781213cf89c7dbffabc576d71783cd4"} Mar 08 20:52:05 crc kubenswrapper[4885]: I0308 20:52:05.488113 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550046-f4q77"] Mar 08 20:52:05 crc kubenswrapper[4885]: I0308 20:52:05.516643 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550046-f4q77"] Mar 08 20:52:07 crc kubenswrapper[4885]: I0308 20:52:07.381827 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b82d1463-69f5-455a-b2bf-493366c067f7" path="/var/lib/kubelet/pods/b82d1463-69f5-455a-b2bf-493366c067f7/volumes" Mar 08 
20:52:12 crc kubenswrapper[4885]: I0308 20:52:12.368831 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:52:12 crc kubenswrapper[4885]: E0308 20:52:12.369984 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:52:24 crc kubenswrapper[4885]: I0308 20:52:24.368062 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:52:24 crc kubenswrapper[4885]: E0308 20:52:24.370391 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:52:36 crc kubenswrapper[4885]: I0308 20:52:36.144008 4885 generic.go:334] "Generic (PLEG): container finished" podID="00704ff6-696f-4687-99e0-23bf055d1bef" containerID="a283cbecf908e182dd51b0bfeb925825b7ba37fe1ff0de14d5fc2f94c054e7eb" exitCode=0 Mar 08 20:52:36 crc kubenswrapper[4885]: I0308 20:52:36.144139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00704ff6-696f-4687-99e0-23bf055d1bef","Type":"ContainerDied","Data":"a283cbecf908e182dd51b0bfeb925825b7ba37fe1ff0de14d5fc2f94c054e7eb"} Mar 08 20:52:37 crc kubenswrapper[4885]: I0308 20:52:37.159161 4885 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00704ff6-696f-4687-99e0-23bf055d1bef","Type":"ContainerStarted","Data":"8a643cc17ac5b6dd9664f5ea40e5d3c3347d697976b4cea9ad7ce397ae21f250"} Mar 08 20:52:37 crc kubenswrapper[4885]: I0308 20:52:37.159745 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 08 20:52:37 crc kubenswrapper[4885]: I0308 20:52:37.203409 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.203382433 podStartE2EDuration="37.203382433s" podCreationTimestamp="2026-03-08 20:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:52:37.193605222 +0000 UTC m=+4858.589659285" watchObservedRunningTime="2026-03-08 20:52:37.203382433 +0000 UTC m=+4858.599436466" Mar 08 20:52:38 crc kubenswrapper[4885]: I0308 20:52:38.170812 4885 generic.go:334] "Generic (PLEG): container finished" podID="f0d39294-b81d-4534-b86a-35a3aea74ed7" containerID="3da51067610480309689286dd1478ea79781213cf89c7dbffabc576d71783cd4" exitCode=0 Mar 08 20:52:38 crc kubenswrapper[4885]: I0308 20:52:38.170867 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f0d39294-b81d-4534-b86a-35a3aea74ed7","Type":"ContainerDied","Data":"3da51067610480309689286dd1478ea79781213cf89c7dbffabc576d71783cd4"} Mar 08 20:52:38 crc kubenswrapper[4885]: I0308 20:52:38.368150 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:52:38 crc kubenswrapper[4885]: E0308 20:52:38.368855 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:52:39 crc kubenswrapper[4885]: I0308 20:52:39.183167 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f0d39294-b81d-4534-b86a-35a3aea74ed7","Type":"ContainerStarted","Data":"de1fa4e51a74fae8244eedc279b615a73ac695d53a2de64cd8ef58a0ec3c637e"} Mar 08 20:52:39 crc kubenswrapper[4885]: I0308 20:52:39.183517 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:39 crc kubenswrapper[4885]: I0308 20:52:39.220549 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.220515019 podStartE2EDuration="38.220515019s" podCreationTimestamp="2026-03-08 20:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:52:39.212594179 +0000 UTC m=+4860.608648292" watchObservedRunningTime="2026-03-08 20:52:39.220515019 +0000 UTC m=+4860.616569092" Mar 08 20:52:51 crc kubenswrapper[4885]: I0308 20:52:51.185218 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 08 20:52:52 crc kubenswrapper[4885]: I0308 20:52:52.336146 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:52 crc kubenswrapper[4885]: I0308 20:52:52.372559 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:52:52 crc kubenswrapper[4885]: E0308 20:52:52.374975 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.296540 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 08 20:52:59 crc kubenswrapper[4885]: E0308 20:52:59.298262 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="init" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.298288 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="init" Mar 08 20:52:59 crc kubenswrapper[4885]: E0308 20:52:59.298323 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2414c1e7-ce59-4c76-865d-1a5ffa71578f" containerName="oc" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.298335 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2414c1e7-ce59-4c76-865d-1a5ffa71578f" containerName="oc" Mar 08 20:52:59 crc kubenswrapper[4885]: E0308 20:52:59.298360 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="dnsmasq-dns" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.298375 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="dnsmasq-dns" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.298648 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2414c1e7-ce59-4c76-865d-1a5ffa71578f" containerName="oc" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.298676 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="dnsmasq-dns" Mar 08 20:52:59 crc 
kubenswrapper[4885]: I0308 20:52:59.299560 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.303079 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xp44w" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.311258 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.401830 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k68tq\" (UniqueName: \"kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq\") pod \"mariadb-client\" (UID: \"b4e6b48b-506c-43ea-8933-c1b68f82790d\") " pod="openstack/mariadb-client" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.504134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k68tq\" (UniqueName: \"kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq\") pod \"mariadb-client\" (UID: \"b4e6b48b-506c-43ea-8933-c1b68f82790d\") " pod="openstack/mariadb-client" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.541586 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k68tq\" (UniqueName: \"kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq\") pod \"mariadb-client\" (UID: \"b4e6b48b-506c-43ea-8933-c1b68f82790d\") " pod="openstack/mariadb-client" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.636996 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:53:00 crc kubenswrapper[4885]: I0308 20:53:00.302895 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:53:00 crc kubenswrapper[4885]: I0308 20:53:00.379214 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b4e6b48b-506c-43ea-8933-c1b68f82790d","Type":"ContainerStarted","Data":"33714966dcd5a2a47e2d73b61ab2fcdb64486f77dd8de4ec2d7f86a9e962ec18"} Mar 08 20:53:01 crc kubenswrapper[4885]: I0308 20:53:01.393656 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b4e6b48b-506c-43ea-8933-c1b68f82790d","Type":"ContainerStarted","Data":"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a"} Mar 08 20:53:01 crc kubenswrapper[4885]: I0308 20:53:01.426306 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=1.924029003 podStartE2EDuration="2.426274505s" podCreationTimestamp="2026-03-08 20:52:59 +0000 UTC" firstStartedPulling="2026-03-08 20:53:00.320367736 +0000 UTC m=+4881.716421799" lastFinishedPulling="2026-03-08 20:53:00.822613238 +0000 UTC m=+4882.218667301" observedRunningTime="2026-03-08 20:53:01.417580503 +0000 UTC m=+4882.813634566" watchObservedRunningTime="2026-03-08 20:53:01.426274505 +0000 UTC m=+4882.822328538" Mar 08 20:53:03 crc kubenswrapper[4885]: E0308 20:53:03.374693 4885 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.80:36730->38.102.83.80:33667: read tcp 38.102.83.80:36730->38.102.83.80:33667: read: connection reset by peer Mar 08 20:53:03 crc kubenswrapper[4885]: I0308 20:53:03.813358 4885 scope.go:117] "RemoveContainer" containerID="28fd3b4e0daaabbfc53b90770763cfd958469e45abae3ae9fd53f9c9e8ab327b" Mar 08 20:53:07 crc kubenswrapper[4885]: I0308 20:53:07.367513 4885 scope.go:117] "RemoveContainer" 
containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:53:07 crc kubenswrapper[4885]: E0308 20:53:07.368175 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:53:16 crc kubenswrapper[4885]: I0308 20:53:16.466731 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:53:16 crc kubenswrapper[4885]: I0308 20:53:16.467651 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="b4e6b48b-506c-43ea-8933-c1b68f82790d" containerName="mariadb-client" containerID="cri-o://3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a" gracePeriod=30 Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.063265 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.217285 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k68tq\" (UniqueName: \"kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq\") pod \"b4e6b48b-506c-43ea-8933-c1b68f82790d\" (UID: \"b4e6b48b-506c-43ea-8933-c1b68f82790d\") " Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.224171 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq" (OuterVolumeSpecName: "kube-api-access-k68tq") pod "b4e6b48b-506c-43ea-8933-c1b68f82790d" (UID: "b4e6b48b-506c-43ea-8933-c1b68f82790d"). 
InnerVolumeSpecName "kube-api-access-k68tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.318757 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k68tq\" (UniqueName: \"kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq\") on node \"crc\" DevicePath \"\"" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.549093 4885 generic.go:334] "Generic (PLEG): container finished" podID="b4e6b48b-506c-43ea-8933-c1b68f82790d" containerID="3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a" exitCode=143 Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.549175 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b4e6b48b-506c-43ea-8933-c1b68f82790d","Type":"ContainerDied","Data":"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a"} Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.549224 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.549771 4885 scope.go:117] "RemoveContainer" containerID="3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.550419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b4e6b48b-506c-43ea-8933-c1b68f82790d","Type":"ContainerDied","Data":"33714966dcd5a2a47e2d73b61ab2fcdb64486f77dd8de4ec2d7f86a9e962ec18"} Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.577577 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.577825 4885 scope.go:117] "RemoveContainer" containerID="3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a" Mar 08 20:53:17 crc kubenswrapper[4885]: E0308 20:53:17.578507 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a\": container with ID starting with 3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a not found: ID does not exist" containerID="3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.578556 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a"} err="failed to get container status \"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a\": rpc error: code = NotFound desc = could not find container \"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a\": container with ID starting with 3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a not found: ID does not exist" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.583996 4885 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:53:19 crc kubenswrapper[4885]: I0308 20:53:19.382786 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e6b48b-506c-43ea-8933-c1b68f82790d" path="/var/lib/kubelet/pods/b4e6b48b-506c-43ea-8933-c1b68f82790d/volumes" Mar 08 20:53:20 crc kubenswrapper[4885]: I0308 20:53:20.368836 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:53:20 crc kubenswrapper[4885]: E0308 20:53:20.369520 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.673587 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-twqvq"] Mar 08 20:53:24 crc kubenswrapper[4885]: E0308 20:53:24.674759 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e6b48b-506c-43ea-8933-c1b68f82790d" containerName="mariadb-client" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.674781 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e6b48b-506c-43ea-8933-c1b68f82790d" containerName="mariadb-client" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.675077 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e6b48b-506c-43ea-8933-c1b68f82790d" containerName="mariadb-client" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.678486 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.702182 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twqvq"] Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.850524 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.850974 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89zjk\" (UniqueName: \"kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.851118 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.952290 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.952466 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-89zjk\" (UniqueName: \"kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.952542 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.953032 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.953166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.975811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89zjk\" (UniqueName: \"kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:25 crc kubenswrapper[4885]: I0308 20:53:25.009886 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:25 crc kubenswrapper[4885]: I0308 20:53:25.533994 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twqvq"] Mar 08 20:53:25 crc kubenswrapper[4885]: I0308 20:53:25.636644 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerStarted","Data":"31daaecae7e860fcf9c1efb90c24db9d8ac1cd2c562f0d39c509abfb2ef1b691"} Mar 08 20:53:26 crc kubenswrapper[4885]: I0308 20:53:26.650402 4885 generic.go:334] "Generic (PLEG): container finished" podID="836676da-534a-42e5-b256-f7d5a2a13f22" containerID="4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2" exitCode=0 Mar 08 20:53:26 crc kubenswrapper[4885]: I0308 20:53:26.650474 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerDied","Data":"4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2"} Mar 08 20:53:27 crc kubenswrapper[4885]: I0308 20:53:27.659005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerStarted","Data":"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c"} Mar 08 20:53:28 crc kubenswrapper[4885]: I0308 20:53:28.671726 4885 generic.go:334] "Generic (PLEG): container finished" podID="836676da-534a-42e5-b256-f7d5a2a13f22" containerID="29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c" exitCode=0 Mar 08 20:53:28 crc kubenswrapper[4885]: I0308 20:53:28.671831 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" 
event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerDied","Data":"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c"} Mar 08 20:53:29 crc kubenswrapper[4885]: I0308 20:53:29.685963 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerStarted","Data":"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f"} Mar 08 20:53:29 crc kubenswrapper[4885]: I0308 20:53:29.717653 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-twqvq" podStartSLOduration=3.268438641 podStartE2EDuration="5.71762489s" podCreationTimestamp="2026-03-08 20:53:24 +0000 UTC" firstStartedPulling="2026-03-08 20:53:26.653287789 +0000 UTC m=+4908.049341842" lastFinishedPulling="2026-03-08 20:53:29.102474028 +0000 UTC m=+4910.498528091" observedRunningTime="2026-03-08 20:53:29.709558995 +0000 UTC m=+4911.105613028" watchObservedRunningTime="2026-03-08 20:53:29.71762489 +0000 UTC m=+4911.113678943" Mar 08 20:53:32 crc kubenswrapper[4885]: I0308 20:53:32.367886 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:53:32 crc kubenswrapper[4885]: E0308 20:53:32.368909 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:53:35 crc kubenswrapper[4885]: I0308 20:53:35.010759 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:35 crc 
kubenswrapper[4885]: I0308 20:53:35.011231 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:35 crc kubenswrapper[4885]: I0308 20:53:35.071910 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:35 crc kubenswrapper[4885]: I0308 20:53:35.811705 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:35 crc kubenswrapper[4885]: I0308 20:53:35.875899 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twqvq"] Mar 08 20:53:37 crc kubenswrapper[4885]: I0308 20:53:37.771421 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-twqvq" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="registry-server" containerID="cri-o://41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f" gracePeriod=2 Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.743091 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-twqvq"
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.782328 4885 generic.go:334] "Generic (PLEG): container finished" podID="836676da-534a-42e5-b256-f7d5a2a13f22" containerID="41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f" exitCode=0
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.782388 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerDied","Data":"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f"}
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.782427 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerDied","Data":"31daaecae7e860fcf9c1efb90c24db9d8ac1cd2c562f0d39c509abfb2ef1b691"}
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.782454 4885 scope.go:117] "RemoveContainer" containerID="41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f"
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.782635 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twqvq"
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.792554 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content\") pod \"836676da-534a-42e5-b256-f7d5a2a13f22\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") "
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.792937 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities\") pod \"836676da-534a-42e5-b256-f7d5a2a13f22\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") "
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.793014 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89zjk\" (UniqueName: \"kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk\") pod \"836676da-534a-42e5-b256-f7d5a2a13f22\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") "
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.794145 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities" (OuterVolumeSpecName: "utilities") pod "836676da-534a-42e5-b256-f7d5a2a13f22" (UID: "836676da-534a-42e5-b256-f7d5a2a13f22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.798344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk" (OuterVolumeSpecName: "kube-api-access-89zjk") pod "836676da-534a-42e5-b256-f7d5a2a13f22" (UID: "836676da-534a-42e5-b256-f7d5a2a13f22"). InnerVolumeSpecName "kube-api-access-89zjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.806339 4885 scope.go:117] "RemoveContainer" containerID="29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c"
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.850302 4885 scope.go:117] "RemoveContainer" containerID="4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2"
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.859491 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "836676da-534a-42e5-b256-f7d5a2a13f22" (UID: "836676da-534a-42e5-b256-f7d5a2a13f22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.885023 4885 scope.go:117] "RemoveContainer" containerID="41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f"
Mar 08 20:53:38 crc kubenswrapper[4885]: E0308 20:53:38.885385 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f\": container with ID starting with 41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f not found: ID does not exist" containerID="41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f"
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.885431 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f"} err="failed to get container status \"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f\": rpc error: code = NotFound desc = could not find container \"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f\": container with ID starting with 41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f not found: ID does not exist"
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.885448 4885 scope.go:117] "RemoveContainer" containerID="29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c"
Mar 08 20:53:38 crc kubenswrapper[4885]: E0308 20:53:38.885655 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c\": container with ID starting with 29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c not found: ID does not exist" containerID="29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c"
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.885681 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c"} err="failed to get container status \"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c\": rpc error: code = NotFound desc = could not find container \"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c\": container with ID starting with 29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c not found: ID does not exist"
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.885692 4885 scope.go:117] "RemoveContainer" containerID="4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2"
Mar 08 20:53:38 crc kubenswrapper[4885]: E0308 20:53:38.886058 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2\": container with ID starting with 4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2 not found: ID does not exist" containerID="4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2"
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.886128 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2"} err="failed to get container status \"4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2\": rpc error: code = NotFound desc = could not find container \"4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2\": container with ID starting with 4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2 not found: ID does not exist"
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.894829 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.894881 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.894903 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89zjk\" (UniqueName: \"kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk\") on node \"crc\" DevicePath \"\""
Mar 08 20:53:39 crc kubenswrapper[4885]: I0308 20:53:39.135528 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twqvq"]
Mar 08 20:53:39 crc kubenswrapper[4885]: I0308 20:53:39.146040 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-twqvq"]
Mar 08 20:53:39 crc kubenswrapper[4885]: I0308 20:53:39.389581 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" path="/var/lib/kubelet/pods/836676da-534a-42e5-b256-f7d5a2a13f22/volumes"
Mar 08 20:53:44 crc kubenswrapper[4885]: I0308 20:53:44.368978 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"
Mar 08 20:53:44 crc kubenswrapper[4885]: E0308 20:53:44.370251 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:53:58 crc kubenswrapper[4885]: I0308 20:53:58.368911 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"
Mar 08 20:53:58 crc kubenswrapper[4885]: E0308 20:53:58.370071 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.167791 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550054-wsncs"]
Mar 08 20:54:00 crc kubenswrapper[4885]: E0308 20:54:00.168530 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="registry-server"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.168559 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="registry-server"
Mar 08 20:54:00 crc kubenswrapper[4885]: E0308 20:54:00.168590 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="extract-utilities"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.168603 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="extract-utilities"
Mar 08 20:54:00 crc kubenswrapper[4885]: E0308 20:54:00.168633 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="extract-content"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.168646 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="extract-content"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.169050 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="registry-server"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.169862 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550054-wsncs"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.173217 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.173278 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.177685 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.181735 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550054-wsncs"]
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.362177 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntkj\" (UniqueName: \"kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj\") pod \"auto-csr-approver-29550054-wsncs\" (UID: \"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b\") " pod="openshift-infra/auto-csr-approver-29550054-wsncs"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.465819 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xntkj\" (UniqueName: \"kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj\") pod \"auto-csr-approver-29550054-wsncs\" (UID: \"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b\") " pod="openshift-infra/auto-csr-approver-29550054-wsncs"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.499271 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xntkj\" (UniqueName: \"kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj\") pod \"auto-csr-approver-29550054-wsncs\" (UID: \"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b\") " pod="openshift-infra/auto-csr-approver-29550054-wsncs"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.501647 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550054-wsncs"
Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.849129 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550054-wsncs"]
Mar 08 20:54:01 crc kubenswrapper[4885]: I0308 20:54:01.025076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550054-wsncs" event={"ID":"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b","Type":"ContainerStarted","Data":"4e30f734ff88492b94c1d6e2b3541ab710b8bf4b9d5f10b38edb420b2bf64bc1"}
Mar 08 20:54:03 crc kubenswrapper[4885]: I0308 20:54:03.048218 4885 generic.go:334] "Generic (PLEG): container finished" podID="66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" containerID="9b23f86db419001dec3042d5f280866857f260d2b86edbe13a17fd8cd9ba2fd4" exitCode=0
Mar 08 20:54:03 crc kubenswrapper[4885]: I0308 20:54:03.048310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550054-wsncs" event={"ID":"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b","Type":"ContainerDied","Data":"9b23f86db419001dec3042d5f280866857f260d2b86edbe13a17fd8cd9ba2fd4"}
Mar 08 20:54:03 crc kubenswrapper[4885]: I0308 20:54:03.920572 4885 scope.go:117] "RemoveContainer" containerID="71660241cb857dd4a39450381f9b1b87218e1ce546c14f61658581a4f1a6ae9d"
Mar 08 20:54:04 crc kubenswrapper[4885]: I0308 20:54:04.494473 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550054-wsncs"
Mar 08 20:54:04 crc kubenswrapper[4885]: I0308 20:54:04.658150 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xntkj\" (UniqueName: \"kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj\") pod \"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b\" (UID: \"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b\") "
Mar 08 20:54:04 crc kubenswrapper[4885]: I0308 20:54:04.664029 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj" (OuterVolumeSpecName: "kube-api-access-xntkj") pod "66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" (UID: "66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b"). InnerVolumeSpecName "kube-api-access-xntkj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:54:04 crc kubenswrapper[4885]: I0308 20:54:04.760650 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xntkj\" (UniqueName: \"kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj\") on node \"crc\" DevicePath \"\""
Mar 08 20:54:05 crc kubenswrapper[4885]: I0308 20:54:05.082905 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550054-wsncs" event={"ID":"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b","Type":"ContainerDied","Data":"4e30f734ff88492b94c1d6e2b3541ab710b8bf4b9d5f10b38edb420b2bf64bc1"}
Mar 08 20:54:05 crc kubenswrapper[4885]: I0308 20:54:05.083008 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e30f734ff88492b94c1d6e2b3541ab710b8bf4b9d5f10b38edb420b2bf64bc1"
Mar 08 20:54:05 crc kubenswrapper[4885]: I0308 20:54:05.083018 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550054-wsncs"
Mar 08 20:54:05 crc kubenswrapper[4885]: I0308 20:54:05.598304 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550048-bs8p7"]
Mar 08 20:54:05 crc kubenswrapper[4885]: I0308 20:54:05.608165 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550048-bs8p7"]
Mar 08 20:54:07 crc kubenswrapper[4885]: I0308 20:54:07.386098 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc070c5-7e69-4caa-82a5-b21b8fa66256" path="/var/lib/kubelet/pods/8cc070c5-7e69-4caa-82a5-b21b8fa66256/volumes"
Mar 08 20:54:11 crc kubenswrapper[4885]: I0308 20:54:11.369675 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"
Mar 08 20:54:12 crc kubenswrapper[4885]: I0308 20:54:12.158440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038"}
Mar 08 20:55:04 crc kubenswrapper[4885]: I0308 20:55:04.026183 4885 scope.go:117] "RemoveContainer" containerID="4768620fbc443f87f40deff915eceaf069ee28b8e25c0efabb4228990e81cee6"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.164996 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550056-kpvwx"]
Mar 08 20:56:00 crc kubenswrapper[4885]: E0308 20:56:00.166023 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" containerName="oc"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.166041 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" containerName="oc"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.166233 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" containerName="oc"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.166798 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550056-kpvwx"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.171793 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.172623 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.174297 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.184763 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550056-kpvwx"]
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.274696 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndbx9\" (UniqueName: \"kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9\") pod \"auto-csr-approver-29550056-kpvwx\" (UID: \"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2\") " pod="openshift-infra/auto-csr-approver-29550056-kpvwx"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.378207 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndbx9\" (UniqueName: \"kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9\") pod \"auto-csr-approver-29550056-kpvwx\" (UID: \"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2\") " pod="openshift-infra/auto-csr-approver-29550056-kpvwx"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.407709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndbx9\" (UniqueName: \"kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9\") pod \"auto-csr-approver-29550056-kpvwx\" (UID: \"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2\") " pod="openshift-infra/auto-csr-approver-29550056-kpvwx"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.490362 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550056-kpvwx"
Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.774310 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550056-kpvwx"]
Mar 08 20:56:00 crc kubenswrapper[4885]: W0308 20:56:00.778374 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97098aa1_1dc7_4efc_b2a2_0c0a97ae36f2.slice/crio-cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4 WatchSource:0}: Error finding container cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4: Status 404 returned error can't find the container with id cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4
Mar 08 20:56:01 crc kubenswrapper[4885]: I0308 20:56:01.229430 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550056-kpvwx" event={"ID":"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2","Type":"ContainerStarted","Data":"cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4"}
Mar 08 20:56:02 crc kubenswrapper[4885]: I0308 20:56:02.242320 4885 generic.go:334] "Generic (PLEG): container finished" podID="97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" containerID="06ba2614b2073ae88c1afd46a3629242eb9b0dfca6cc39c42c6f2b45e68e1af1" exitCode=0
Mar 08 20:56:02 crc kubenswrapper[4885]: I0308 20:56:02.242425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550056-kpvwx" event={"ID":"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2","Type":"ContainerDied","Data":"06ba2614b2073ae88c1afd46a3629242eb9b0dfca6cc39c42c6f2b45e68e1af1"}
Mar 08 20:56:03 crc kubenswrapper[4885]: I0308 20:56:03.647347 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550056-kpvwx"
Mar 08 20:56:03 crc kubenswrapper[4885]: I0308 20:56:03.732697 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndbx9\" (UniqueName: \"kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9\") pod \"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2\" (UID: \"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2\") "
Mar 08 20:56:03 crc kubenswrapper[4885]: I0308 20:56:03.739728 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9" (OuterVolumeSpecName: "kube-api-access-ndbx9") pod "97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" (UID: "97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2"). InnerVolumeSpecName "kube-api-access-ndbx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:56:03 crc kubenswrapper[4885]: I0308 20:56:03.834400 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndbx9\" (UniqueName: \"kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9\") on node \"crc\" DevicePath \"\""
Mar 08 20:56:04 crc kubenswrapper[4885]: I0308 20:56:04.258631 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550056-kpvwx" event={"ID":"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2","Type":"ContainerDied","Data":"cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4"}
Mar 08 20:56:04 crc kubenswrapper[4885]: I0308 20:56:04.258665 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4"
Mar 08 20:56:04 crc kubenswrapper[4885]: I0308 20:56:04.258978 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550056-kpvwx"
Mar 08 20:56:04 crc kubenswrapper[4885]: I0308 20:56:04.739897 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550050-xrlpj"]
Mar 08 20:56:04 crc kubenswrapper[4885]: I0308 20:56:04.757318 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550050-xrlpj"]
Mar 08 20:56:05 crc kubenswrapper[4885]: I0308 20:56:05.383748 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c364e354-f542-45ec-9322-125db18eb928" path="/var/lib/kubelet/pods/c364e354-f542-45ec-9322-125db18eb928/volumes"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.526534 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"]
Mar 08 20:56:11 crc kubenswrapper[4885]: E0308 20:56:11.527624 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" containerName="oc"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.527645 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" containerName="oc"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.527950 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" containerName="oc"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.530240 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.546290 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"]
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.671314 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.671415 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.671440 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh9jf\" (UniqueName: \"kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.772780 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.772831 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh9jf\" (UniqueName: \"kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.772976 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.773381 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.773470 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.810942 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh9jf\" (UniqueName: \"kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.880731 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:12 crc kubenswrapper[4885]: I0308 20:56:12.314683 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"]
Mar 08 20:56:13 crc kubenswrapper[4885]: I0308 20:56:13.337829 4885 generic.go:334] "Generic (PLEG): container finished" podID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerID="736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a" exitCode=0
Mar 08 20:56:13 crc kubenswrapper[4885]: I0308 20:56:13.337958 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerDied","Data":"736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a"}
Mar 08 20:56:13 crc kubenswrapper[4885]: I0308 20:56:13.338081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerStarted","Data":"4f37e648886bd997e7152ec15c819b52ba5354bfc7c1ad348fdee032439da0bf"}
Mar 08 20:56:14 crc kubenswrapper[4885]: I0308 20:56:14.349035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerStarted","Data":"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350"}
Mar 08 20:56:15 crc kubenswrapper[4885]: I0308 20:56:15.362767 4885 generic.go:334] "Generic (PLEG): container finished" podID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerID="8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350" exitCode=0
Mar 08 20:56:15 crc kubenswrapper[4885]: I0308 20:56:15.363228 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerDied","Data":"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350"}
Mar 08 20:56:16 crc kubenswrapper[4885]: I0308 20:56:16.373648 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerStarted","Data":"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16"}
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.035104 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n5bmd" podStartSLOduration=5.410986352 podStartE2EDuration="8.035084233s" podCreationTimestamp="2026-03-08 20:56:11 +0000 UTC" firstStartedPulling="2026-03-08 20:56:13.342144691 +0000 UTC m=+5074.738198744" lastFinishedPulling="2026-03-08 20:56:15.966242572 +0000 UTC m=+5077.362296625" observedRunningTime="2026-03-08 20:56:16.412076739 +0000 UTC m=+5077.808130772" watchObservedRunningTime="2026-03-08 20:56:19.035084233 +0000 UTC m=+5080.431138256"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.040143 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"]
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.042329 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6mx7n"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.054584 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"]
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.223077 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29kkf\" (UniqueName: \"kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.223288 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.223448 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.324533 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.324673 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29kkf\" (UniqueName: \"kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.324792 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.325245 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.325406 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.343505 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29kkf\" (UniqueName: \"kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.363988 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6mx7n"
Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.880244 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"]
Mar 08 20:56:20 crc kubenswrapper[4885]: I0308 20:56:20.418760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerStarted","Data":"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b"}
Mar 08 20:56:20 crc kubenswrapper[4885]: I0308 20:56:20.419184 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerStarted","Data":"0da2db770322b4cd9942fcab3f1443894aab6a4200b01789887cd5d5ab8a0923"}
Mar 08 20:56:21 crc kubenswrapper[4885]: I0308 20:56:21.433302 4885 generic.go:334] "Generic (PLEG): container finished" podID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerID="a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b" exitCode=0
Mar 08 20:56:21 crc kubenswrapper[4885]: I0308 20:56:21.433389 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerDied","Data":"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b"}
Mar 08 20:56:21 crc kubenswrapper[4885]: I0308 20:56:21.881761 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:21 crc kubenswrapper[4885]: I0308 20:56:21.883516 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:22 crc kubenswrapper[4885]: I0308 20:56:22.446630 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerStarted","Data":"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634"} Mar 08 20:56:22 crc kubenswrapper[4885]: I0308 20:56:22.965890 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5bmd" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="registry-server" probeResult="failure" output=< Mar 08 20:56:22 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 20:56:22 crc kubenswrapper[4885]: > Mar 08 20:56:23 crc kubenswrapper[4885]: I0308 20:56:23.477372 4885 generic.go:334] "Generic (PLEG): container finished" podID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerID="7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634" exitCode=0 Mar 08 20:56:23 crc kubenswrapper[4885]: I0308 20:56:23.477435 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerDied","Data":"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634"} Mar 08 20:56:24 crc kubenswrapper[4885]: I0308 20:56:24.490633 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerStarted","Data":"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4"} Mar 08 20:56:24 crc kubenswrapper[4885]: I0308 20:56:24.519323 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6mx7n" podStartSLOduration=3.083350574 podStartE2EDuration="5.519302029s" podCreationTimestamp="2026-03-08 20:56:19 +0000 UTC" firstStartedPulling="2026-03-08 20:56:21.436115425 +0000 UTC m=+5082.832169478" lastFinishedPulling="2026-03-08 20:56:23.87206687 +0000 UTC m=+5085.268120933" 
observedRunningTime="2026-03-08 20:56:24.51558267 +0000 UTC m=+5085.911636723" watchObservedRunningTime="2026-03-08 20:56:24.519302029 +0000 UTC m=+5085.915356052" Mar 08 20:56:29 crc kubenswrapper[4885]: I0308 20:56:29.364741 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:29 crc kubenswrapper[4885]: I0308 20:56:29.365352 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:29 crc kubenswrapper[4885]: I0308 20:56:29.440781 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:29 crc kubenswrapper[4885]: I0308 20:56:29.611546 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:29 crc kubenswrapper[4885]: I0308 20:56:29.690599 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"] Mar 08 20:56:31 crc kubenswrapper[4885]: I0308 20:56:31.556031 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6mx7n" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="registry-server" containerID="cri-o://d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4" gracePeriod=2 Mar 08 20:56:31 crc kubenswrapper[4885]: I0308 20:56:31.948175 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.014281 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.051389 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.159763 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29kkf\" (UniqueName: \"kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf\") pod \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.160041 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities\") pod \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.160786 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content\") pod \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.161022 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities" (OuterVolumeSpecName: "utilities") pod "12ebdbbf-4422-4e98-acc5-cca08fcb3444" (UID: "12ebdbbf-4422-4e98-acc5-cca08fcb3444"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.161471 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.165342 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf" (OuterVolumeSpecName: "kube-api-access-29kkf") pod "12ebdbbf-4422-4e98-acc5-cca08fcb3444" (UID: "12ebdbbf-4422-4e98-acc5-cca08fcb3444"). InnerVolumeSpecName "kube-api-access-29kkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.220344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12ebdbbf-4422-4e98-acc5-cca08fcb3444" (UID: "12ebdbbf-4422-4e98-acc5-cca08fcb3444"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.263072 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29kkf\" (UniqueName: \"kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf\") on node \"crc\" DevicePath \"\"" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.263116 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.567674 4885 generic.go:334] "Generic (PLEG): container finished" podID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerID="d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4" exitCode=0 Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.567828 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.567854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerDied","Data":"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4"} Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.567956 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerDied","Data":"0da2db770322b4cd9942fcab3f1443894aab6a4200b01789887cd5d5ab8a0923"} Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.567990 4885 scope.go:117] "RemoveContainer" containerID="d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.609389 4885 scope.go:117] "RemoveContainer" 
containerID="7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.640640 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"] Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.657619 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"] Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.660193 4885 scope.go:117] "RemoveContainer" containerID="a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.708860 4885 scope.go:117] "RemoveContainer" containerID="d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4" Mar 08 20:56:32 crc kubenswrapper[4885]: E0308 20:56:32.711631 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4\": container with ID starting with d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4 not found: ID does not exist" containerID="d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.711688 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4"} err="failed to get container status \"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4\": rpc error: code = NotFound desc = could not find container \"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4\": container with ID starting with d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4 not found: ID does not exist" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.711720 4885 scope.go:117] "RemoveContainer" 
containerID="7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634" Mar 08 20:56:32 crc kubenswrapper[4885]: E0308 20:56:32.712424 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634\": container with ID starting with 7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634 not found: ID does not exist" containerID="7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.712491 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634"} err="failed to get container status \"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634\": rpc error: code = NotFound desc = could not find container \"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634\": container with ID starting with 7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634 not found: ID does not exist" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.712533 4885 scope.go:117] "RemoveContainer" containerID="a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b" Mar 08 20:56:32 crc kubenswrapper[4885]: E0308 20:56:32.713162 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b\": container with ID starting with a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b not found: ID does not exist" containerID="a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.713205 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b"} err="failed to get container status \"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b\": rpc error: code = NotFound desc = could not find container \"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b\": container with ID starting with a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b not found: ID does not exist" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.818423 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.818502 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:56:33 crc kubenswrapper[4885]: I0308 20:56:33.385155 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" path="/var/lib/kubelet/pods/12ebdbbf-4422-4e98-acc5-cca08fcb3444/volumes" Mar 08 20:56:33 crc kubenswrapper[4885]: I0308 20:56:33.894165 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"] Mar 08 20:56:33 crc kubenswrapper[4885]: I0308 20:56:33.894588 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n5bmd" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="registry-server" containerID="cri-o://a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16" gracePeriod=2 
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.379841 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.497705 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities\") pod \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") "
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.497889 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh9jf\" (UniqueName: \"kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf\") pod \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") "
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.498004 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content\") pod \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") "
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.498554 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities" (OuterVolumeSpecName: "utilities") pod "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" (UID: "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.501517 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf" (OuterVolumeSpecName: "kube-api-access-nh9jf") pod "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" (UID: "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e"). InnerVolumeSpecName "kube-api-access-nh9jf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.586134 4885 generic.go:334] "Generic (PLEG): container finished" podID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerID="a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16" exitCode=0
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.586185 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerDied","Data":"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16"}
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.586210 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5bmd"
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.586237 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerDied","Data":"4f37e648886bd997e7152ec15c819b52ba5354bfc7c1ad348fdee032439da0bf"}
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.586256 4885 scope.go:117] "RemoveContainer" containerID="a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16"
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.599893 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.599934 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh9jf\" (UniqueName: \"kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf\") on node \"crc\" DevicePath \"\""
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.605966 4885 scope.go:117] "RemoveContainer" containerID="8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350"
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.625552 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" (UID: "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.641468 4885 scope.go:117] "RemoveContainer" containerID="736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a"
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.666755 4885 scope.go:117] "RemoveContainer" containerID="a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16"
Mar 08 20:56:34 crc kubenswrapper[4885]: E0308 20:56:34.667250 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16\": container with ID starting with a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16 not found: ID does not exist" containerID="a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16"
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.667288 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16"} err="failed to get container status \"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16\": rpc error: code = NotFound desc = could not find container \"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16\": container with ID starting with a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16 not found: ID does not exist"
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.667313 4885 scope.go:117] "RemoveContainer" containerID="8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350"
Mar 08 20:56:34 crc kubenswrapper[4885]: E0308 20:56:34.667801 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350\": container with ID starting with 8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350 not found: ID does not exist" containerID="8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350"
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.667870 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350"} err="failed to get container status \"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350\": rpc error: code = NotFound desc = could not find container \"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350\": container with ID starting with 8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350 not found: ID does not exist"
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.667958 4885 scope.go:117] "RemoveContainer" containerID="736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a"
Mar 08 20:56:34 crc kubenswrapper[4885]: E0308 20:56:34.668386 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a\": container with ID starting with 736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a not found: ID does not exist" containerID="736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a"
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.668427 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a"} err="failed to get container status \"736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a\": rpc error: code = NotFound desc = could not find container \"736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a\": container with ID starting with 736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a not found: ID does not exist"
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.702058 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.937151 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"]
Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.947942 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"]
Mar 08 20:56:35 crc kubenswrapper[4885]: I0308 20:56:35.384386 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" path="/var/lib/kubelet/pods/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e/volumes"
Mar 08 20:57:02 crc kubenswrapper[4885]: I0308 20:57:02.818650 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 20:57:02 crc kubenswrapper[4885]: I0308 20:57:02.820679 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 20:57:04 crc kubenswrapper[4885]: I0308 20:57:04.160170 4885 scope.go:117] "RemoveContainer" containerID="00728729b1a54767f7bf5ead12746c7b79c9d4ef7c28991788ed27e082f5ef87"
Mar 08 20:57:32 crc kubenswrapper[4885]: I0308 20:57:32.818768 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 20:57:32 crc kubenswrapper[4885]: I0308 20:57:32.819417 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 20:57:32 crc kubenswrapper[4885]: I0308 20:57:32.819502 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 20:57:32 crc kubenswrapper[4885]: I0308 20:57:32.820703 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 20:57:32 crc kubenswrapper[4885]: I0308 20:57:32.820834 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038" gracePeriod=600
Mar 08 20:57:33 crc kubenswrapper[4885]: I0308 20:57:33.116435 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038" exitCode=0
Mar 08 20:57:33 crc kubenswrapper[4885]: I0308 20:57:33.116510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038"}
Mar 08 20:57:33 crc kubenswrapper[4885]: I0308 20:57:33.116559 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"
Mar 08 20:57:34 crc kubenswrapper[4885]: I0308 20:57:34.134040 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"}
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.179844 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550058-z6lnn"]
Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181344 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="extract-utilities"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181410 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="extract-utilities"
Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181453 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="registry-server"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181475 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="registry-server"
Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181497 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="extract-content"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181515 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="extract-content"
Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181561 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="extract-utilities"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181578 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="extract-utilities"
Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181609 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="registry-server"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181629 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="registry-server"
Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181649 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="extract-content"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181668 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="extract-content"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.182094 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="registry-server"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.182135 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="registry-server"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.182821 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550058-z6lnn"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.191092 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.194449 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.195286 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.200224 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550058-z6lnn"]
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.229456 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kmvj\" (UniqueName: \"kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj\") pod \"auto-csr-approver-29550058-z6lnn\" (UID: \"95f14f7f-4dec-4d9d-a320-7a5c927d4983\") " pod="openshift-infra/auto-csr-approver-29550058-z6lnn"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.331345 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kmvj\" (UniqueName: \"kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj\") pod \"auto-csr-approver-29550058-z6lnn\" (UID: \"95f14f7f-4dec-4d9d-a320-7a5c927d4983\") " pod="openshift-infra/auto-csr-approver-29550058-z6lnn"
Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.358207 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kmvj\" (UniqueName: \"kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj\") pod \"auto-csr-approver-29550058-z6lnn\" (UID: \"95f14f7f-4dec-4d9d-a320-7a5c927d4983\") "
pod="openshift-infra/auto-csr-approver-29550058-z6lnn" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.509785 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" Mar 08 20:58:01 crc kubenswrapper[4885]: I0308 20:58:01.001092 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550058-z6lnn"] Mar 08 20:58:01 crc kubenswrapper[4885]: I0308 20:58:01.172035 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:58:01 crc kubenswrapper[4885]: I0308 20:58:01.387809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" event={"ID":"95f14f7f-4dec-4d9d-a320-7a5c927d4983","Type":"ContainerStarted","Data":"10b28d86c25bcf38b02ed90d52c136f2c9819b6432c2748afb6d02e50c08f796"} Mar 08 20:58:02 crc kubenswrapper[4885]: I0308 20:58:02.397650 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" event={"ID":"95f14f7f-4dec-4d9d-a320-7a5c927d4983","Type":"ContainerStarted","Data":"21b8174ae95621e7d89055b3e4716d5a83a8f7fb3dd103300c6b0dc26e415bb4"} Mar 08 20:58:02 crc kubenswrapper[4885]: I0308 20:58:02.418430 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" podStartSLOduration=1.598702472 podStartE2EDuration="2.418404699s" podCreationTimestamp="2026-03-08 20:58:00 +0000 UTC" firstStartedPulling="2026-03-08 20:58:01.171752288 +0000 UTC m=+5182.567806311" lastFinishedPulling="2026-03-08 20:58:01.991454475 +0000 UTC m=+5183.387508538" observedRunningTime="2026-03-08 20:58:02.414269099 +0000 UTC m=+5183.810323132" watchObservedRunningTime="2026-03-08 20:58:02.418404699 +0000 UTC m=+5183.814458762" Mar 08 20:58:03 crc kubenswrapper[4885]: I0308 20:58:03.406486 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="95f14f7f-4dec-4d9d-a320-7a5c927d4983" containerID="21b8174ae95621e7d89055b3e4716d5a83a8f7fb3dd103300c6b0dc26e415bb4" exitCode=0 Mar 08 20:58:03 crc kubenswrapper[4885]: I0308 20:58:03.406531 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" event={"ID":"95f14f7f-4dec-4d9d-a320-7a5c927d4983","Type":"ContainerDied","Data":"21b8174ae95621e7d89055b3e4716d5a83a8f7fb3dd103300c6b0dc26e415bb4"} Mar 08 20:58:04 crc kubenswrapper[4885]: I0308 20:58:04.296357 4885 scope.go:117] "RemoveContainer" containerID="320469e7a132de2b538abe239f05e1a393daef2ccce9780de0687281e071e2ef" Mar 08 20:58:04 crc kubenswrapper[4885]: I0308 20:58:04.750970 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" Mar 08 20:58:04 crc kubenswrapper[4885]: I0308 20:58:04.827280 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kmvj\" (UniqueName: \"kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj\") pod \"95f14f7f-4dec-4d9d-a320-7a5c927d4983\" (UID: \"95f14f7f-4dec-4d9d-a320-7a5c927d4983\") " Mar 08 20:58:04 crc kubenswrapper[4885]: I0308 20:58:04.835684 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj" (OuterVolumeSpecName: "kube-api-access-2kmvj") pod "95f14f7f-4dec-4d9d-a320-7a5c927d4983" (UID: "95f14f7f-4dec-4d9d-a320-7a5c927d4983"). InnerVolumeSpecName "kube-api-access-2kmvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:58:04 crc kubenswrapper[4885]: I0308 20:58:04.930715 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kmvj\" (UniqueName: \"kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj\") on node \"crc\" DevicePath \"\"" Mar 08 20:58:05 crc kubenswrapper[4885]: I0308 20:58:05.430264 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" event={"ID":"95f14f7f-4dec-4d9d-a320-7a5c927d4983","Type":"ContainerDied","Data":"10b28d86c25bcf38b02ed90d52c136f2c9819b6432c2748afb6d02e50c08f796"} Mar 08 20:58:05 crc kubenswrapper[4885]: I0308 20:58:05.430320 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10b28d86c25bcf38b02ed90d52c136f2c9819b6432c2748afb6d02e50c08f796" Mar 08 20:58:05 crc kubenswrapper[4885]: I0308 20:58:05.430396 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" Mar 08 20:58:05 crc kubenswrapper[4885]: I0308 20:58:05.499661 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550052-jkg7s"] Mar 08 20:58:05 crc kubenswrapper[4885]: I0308 20:58:05.510302 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550052-jkg7s"] Mar 08 20:58:07 crc kubenswrapper[4885]: I0308 20:58:07.394064 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2414c1e7-ce59-4c76-865d-1a5ffa71578f" path="/var/lib/kubelet/pods/2414c1e7-ce59-4c76-865d-1a5ffa71578f/volumes" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.468606 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Mar 08 20:58:09 crc kubenswrapper[4885]: E0308 20:58:09.469335 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f14f7f-4dec-4d9d-a320-7a5c927d4983" 
containerName="oc" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.469352 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f14f7f-4dec-4d9d-a320-7a5c927d4983" containerName="oc" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.469660 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f14f7f-4dec-4d9d-a320-7a5c927d4983" containerName="oc" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.470270 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.472790 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xp44w" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.476136 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.509779 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scrbk\" (UniqueName: \"kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.509840 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.611648 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scrbk\" (UniqueName: \"kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk\") pod 
\"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.612103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.615833 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.615894 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/acd8297510132edeb9d5328b0d06a30d4f8877acc00b01d77a9c3ca4476f150c/globalmount\"" pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.640461 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scrbk\" (UniqueName: \"kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.658676 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") pod \"mariadb-copy-data\" (UID: 
\"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.797011 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 08 20:58:10 crc kubenswrapper[4885]: I0308 20:58:10.135534 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 08 20:58:10 crc kubenswrapper[4885]: W0308 20:58:10.147598 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a10ccbd_e30c_478f_84a4_c869a8cd0924.slice/crio-035005254ed1ada4ad851bbbcd646012f92c499c388cf96db3c6e6fbc8bfb688 WatchSource:0}: Error finding container 035005254ed1ada4ad851bbbcd646012f92c499c388cf96db3c6e6fbc8bfb688: Status 404 returned error can't find the container with id 035005254ed1ada4ad851bbbcd646012f92c499c388cf96db3c6e6fbc8bfb688 Mar 08 20:58:10 crc kubenswrapper[4885]: I0308 20:58:10.474137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1a10ccbd-e30c-478f-84a4-c869a8cd0924","Type":"ContainerStarted","Data":"ece949f19629c28be550e718314e58a7cadf1e2c5464306896ed9854368d9a69"} Mar 08 20:58:10 crc kubenswrapper[4885]: I0308 20:58:10.474223 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1a10ccbd-e30c-478f-84a4-c869a8cd0924","Type":"ContainerStarted","Data":"035005254ed1ada4ad851bbbcd646012f92c499c388cf96db3c6e6fbc8bfb688"} Mar 08 20:58:10 crc kubenswrapper[4885]: I0308 20:58:10.504035 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.50400708 podStartE2EDuration="2.50400708s" podCreationTimestamp="2026-03-08 20:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:58:10.500329812 
+0000 UTC m=+5191.896383865" watchObservedRunningTime="2026-03-08 20:58:10.50400708 +0000 UTC m=+5191.900061133" Mar 08 20:58:13 crc kubenswrapper[4885]: I0308 20:58:13.693266 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:13 crc kubenswrapper[4885]: I0308 20:58:13.694665 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:13 crc kubenswrapper[4885]: I0308 20:58:13.710810 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:13 crc kubenswrapper[4885]: I0308 20:58:13.887605 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwsz7\" (UniqueName: \"kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7\") pod \"mariadb-client\" (UID: \"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f\") " pod="openstack/mariadb-client" Mar 08 20:58:13 crc kubenswrapper[4885]: I0308 20:58:13.989451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwsz7\" (UniqueName: \"kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7\") pod \"mariadb-client\" (UID: \"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f\") " pod="openstack/mariadb-client" Mar 08 20:58:14 crc kubenswrapper[4885]: I0308 20:58:14.361947 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwsz7\" (UniqueName: \"kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7\") pod \"mariadb-client\" (UID: \"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f\") " pod="openstack/mariadb-client" Mar 08 20:58:14 crc kubenswrapper[4885]: I0308 20:58:14.659405 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:15 crc kubenswrapper[4885]: I0308 20:58:15.205174 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:15 crc kubenswrapper[4885]: W0308 20:58:15.211173 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffbeea9e_d5fb_42aa_8ad9_a687fc50c96f.slice/crio-90150457faeb182e5ccb269e6333eb0fa773cc31eb2e3ee37ea0bf490081de89 WatchSource:0}: Error finding container 90150457faeb182e5ccb269e6333eb0fa773cc31eb2e3ee37ea0bf490081de89: Status 404 returned error can't find the container with id 90150457faeb182e5ccb269e6333eb0fa773cc31eb2e3ee37ea0bf490081de89 Mar 08 20:58:15 crc kubenswrapper[4885]: I0308 20:58:15.528526 4885 generic.go:334] "Generic (PLEG): container finished" podID="ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" containerID="cb3b0cf9a33a47c0e77a72a5c33129d359ae195a6b6e13ab819d24d148ae74bb" exitCode=0 Mar 08 20:58:15 crc kubenswrapper[4885]: I0308 20:58:15.528586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f","Type":"ContainerDied","Data":"cb3b0cf9a33a47c0e77a72a5c33129d359ae195a6b6e13ab819d24d148ae74bb"} Mar 08 20:58:15 crc kubenswrapper[4885]: I0308 20:58:15.528629 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f","Type":"ContainerStarted","Data":"90150457faeb182e5ccb269e6333eb0fa773cc31eb2e3ee37ea0bf490081de89"} Mar 08 20:58:16 crc kubenswrapper[4885]: I0308 20:58:16.933979 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:16 crc kubenswrapper[4885]: I0308 20:58:16.959330 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f/mariadb-client/0.log" Mar 08 20:58:16 crc kubenswrapper[4885]: I0308 20:58:16.989062 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:16 crc kubenswrapper[4885]: I0308 20:58:16.995392 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.042835 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwsz7\" (UniqueName: \"kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7\") pod \"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f\" (UID: \"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f\") " Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.051102 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7" (OuterVolumeSpecName: "kube-api-access-hwsz7") pod "ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" (UID: "ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f"). InnerVolumeSpecName "kube-api-access-hwsz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.114458 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:17 crc kubenswrapper[4885]: E0308 20:58:17.115009 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" containerName="mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.115042 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" containerName="mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.115290 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" containerName="mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.116106 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.145612 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.151912 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwsz7\" (UniqueName: \"kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7\") on node \"crc\" DevicePath \"\"" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.253280 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vx74\" (UniqueName: \"kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74\") pod \"mariadb-client\" (UID: \"030a880e-43ba-49b0-a593-248f9c58df16\") " pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.354749 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vx74\" (UniqueName: 
\"kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74\") pod \"mariadb-client\" (UID: \"030a880e-43ba-49b0-a593-248f9c58df16\") " pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.378501 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" path="/var/lib/kubelet/pods/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f/volumes" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.380310 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vx74\" (UniqueName: \"kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74\") pod \"mariadb-client\" (UID: \"030a880e-43ba-49b0-a593-248f9c58df16\") " pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.485625 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.543538 4885 scope.go:117] "RemoveContainer" containerID="cb3b0cf9a33a47c0e77a72a5c33129d359ae195a6b6e13ab819d24d148ae74bb" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.543646 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.912384 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:17 crc kubenswrapper[4885]: W0308 20:58:17.925175 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod030a880e_43ba_49b0_a593_248f9c58df16.slice/crio-dea38899acf042e904b04c9a685602651f109b76aefcda3b6e5e833251f9ed23 WatchSource:0}: Error finding container dea38899acf042e904b04c9a685602651f109b76aefcda3b6e5e833251f9ed23: Status 404 returned error can't find the container with id dea38899acf042e904b04c9a685602651f109b76aefcda3b6e5e833251f9ed23 Mar 08 20:58:18 crc kubenswrapper[4885]: I0308 20:58:18.553004 4885 generic.go:334] "Generic (PLEG): container finished" podID="030a880e-43ba-49b0-a593-248f9c58df16" containerID="7d907d9b611e54cf97509bd0480731a1868d207218e0f55feb360b3b591d95c2" exitCode=0 Mar 08 20:58:18 crc kubenswrapper[4885]: I0308 20:58:18.553055 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"030a880e-43ba-49b0-a593-248f9c58df16","Type":"ContainerDied","Data":"7d907d9b611e54cf97509bd0480731a1868d207218e0f55feb360b3b591d95c2"} Mar 08 20:58:18 crc kubenswrapper[4885]: I0308 20:58:18.553302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"030a880e-43ba-49b0-a593-248f9c58df16","Type":"ContainerStarted","Data":"dea38899acf042e904b04c9a685602651f109b76aefcda3b6e5e833251f9ed23"} Mar 08 20:58:19 crc kubenswrapper[4885]: I0308 20:58:19.968470 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:19 crc kubenswrapper[4885]: I0308 20:58:19.993266 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_030a880e-43ba-49b0-a593-248f9c58df16/mariadb-client/0.log" Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.007584 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vx74\" (UniqueName: \"kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74\") pod \"030a880e-43ba-49b0-a593-248f9c58df16\" (UID: \"030a880e-43ba-49b0-a593-248f9c58df16\") " Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.016347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74" (OuterVolumeSpecName: "kube-api-access-8vx74") pod "030a880e-43ba-49b0-a593-248f9c58df16" (UID: "030a880e-43ba-49b0-a593-248f9c58df16"). InnerVolumeSpecName "kube-api-access-8vx74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.031828 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.043470 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.109136 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vx74\" (UniqueName: \"kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74\") on node \"crc\" DevicePath \"\"" Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.574900 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dea38899acf042e904b04c9a685602651f109b76aefcda3b6e5e833251f9ed23" Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.575091 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:21 crc kubenswrapper[4885]: I0308 20:58:21.386418 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="030a880e-43ba-49b0-a593-248f9c58df16" path="/var/lib/kubelet/pods/030a880e-43ba-49b0-a593-248f9c58df16/volumes" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.213419 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 20:58:59 crc kubenswrapper[4885]: E0308 20:58:59.216637 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030a880e-43ba-49b0-a593-248f9c58df16" containerName="mariadb-client" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.216891 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="030a880e-43ba-49b0-a593-248f9c58df16" containerName="mariadb-client" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.218843 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="030a880e-43ba-49b0-a593-248f9c58df16" 
containerName="mariadb-client" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.220689 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.229286 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.230448 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-h588k" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.237855 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.245617 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.266345 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.267831 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.284795 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.286471 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.303068 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.316473 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.319669 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.319825 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.319911 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hn2k\" (UniqueName: \"kubernetes.io/projected/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-kube-api-access-5hn2k\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.319950 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.320218 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.320293 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421480 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421550 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11852e05-e4cd-4884-b382-035694906263-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421655 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4rtrp\" (UniqueName: \"kubernetes.io/projected/11852e05-e4cd-4884-b382-035694906263-kube-api-access-4rtrp\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421800 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421843 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-config\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421884 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1cdde225-3478-4566-9019-df846ce962fb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422020 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltn22\" (UniqueName: 
\"kubernetes.io/projected/1cdde225-3478-4566-9019-df846ce962fb-kube-api-access-ltn22\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422069 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11852e05-e4cd-4884-b382-035694906263-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422112 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422316 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422409 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hn2k\" (UniqueName: \"kubernetes.io/projected/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-kube-api-access-5hn2k\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422460 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422562 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cdde225-3478-4566-9019-df846ce962fb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422695 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422735 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-config\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.424267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.424312 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.424668 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.429895 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.429984 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ca9173663b5e8869f2a1d06ba2f0d2643b686ec230901091c64e10437141f124/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.439695 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.447139 4885 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.449650 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.454702 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8wnqg" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.455108 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.455343 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.464043 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hn2k\" (UniqueName: \"kubernetes.io/projected/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-kube-api-access-5hn2k\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.473693 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.476817 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.486410 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.490078 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.512325 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.513000 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.519140 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.523914 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltn22\" (UniqueName: \"kubernetes.io/projected/1cdde225-3478-4566-9019-df846ce962fb-kube-api-access-ltn22\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.523985 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11852e05-e4cd-4884-b382-035694906263-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524003 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524039 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-config\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524061 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524110 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524132 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cdde225-3478-4566-9019-df846ce962fb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524150 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebd9461e-0196-4eaf-a733-44340b19d354-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524174 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-config\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524190 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb9d4\" (UniqueName: \"kubernetes.io/projected/ebd9461e-0196-4eaf-a733-44340b19d354-kube-api-access-sb9d4\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524222 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11852e05-e4cd-4884-b382-035694906263-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524282 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rtrp\" (UniqueName: \"kubernetes.io/projected/11852e05-e4cd-4884-b382-035694906263-kube-api-access-4rtrp\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524308 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524328 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd9461e-0196-4eaf-a733-44340b19d354-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524357 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524375 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-config\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524398 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1cdde225-3478-4566-9019-df846ce962fb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524798 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1cdde225-3478-4566-9019-df846ce962fb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.525457 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11852e05-e4cd-4884-b382-035694906263-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.525705 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.526473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.526676 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-config\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.529491 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-config\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.530423 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.536563 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.536592 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8ebd3a6eda502377f605d32caed44b01abe458065c6f1cd0a759ff1bb8cc7eb5/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.536934 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cdde225-3478-4566-9019-df846ce962fb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.540042 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.540062 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6b8422d04854d1c8065197f1c0893c01a2f78fb0cd1fc2b94e00e55b33d539b5/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.541487 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11852e05-e4cd-4884-b382-035694906263-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.545200 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rtrp\" (UniqueName: \"kubernetes.io/projected/11852e05-e4cd-4884-b382-035694906263-kube-api-access-4rtrp\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.546333 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltn22\" (UniqueName: \"kubernetes.io/projected/1cdde225-3478-4566-9019-df846ce962fb-kube-api-access-ltn22\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.559392 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.566428 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.567119 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.586512 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.604396 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.625626 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626017 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-config\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626223 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626281 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebd9461e-0196-4eaf-a733-44340b19d354-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc 
kubenswrapper[4885]: I0308 20:58:59.626316 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-config\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626382 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb9d4\" (UniqueName: \"kubernetes.io/projected/ebd9461e-0196-4eaf-a733-44340b19d354-kube-api-access-sb9d4\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626478 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcrpr\" (UniqueName: \"kubernetes.io/projected/4eca16e2-6962-4cad-9cbb-23d33af9c10a-kube-api-access-pcrpr\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626541 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626590 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-config\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebd9461e-0196-4eaf-a733-44340b19d354-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.627704 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-config\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.628551 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.628591 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f5ed38bef82799b6681aaf26cc928b27424b3c057ebb6776ac2ea2fccf1a63e7/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629345 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d605915b-24f4-45ec-bb13-7e7097bb288b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629438 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmrh6\" (UniqueName: \"kubernetes.io/projected/d605915b-24f4-45ec-bb13-7e7097bb288b-kube-api-access-mmrh6\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629530 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d605915b-24f4-45ec-bb13-7e7097bb288b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629586 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629634 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd9461e-0196-4eaf-a733-44340b19d354-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629670 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eca16e2-6962-4cad-9cbb-23d33af9c10a-combined-ca-bundle\") pod 
\"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4eca16e2-6962-4cad-9cbb-23d33af9c10a-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.631690 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.634735 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd9461e-0196-4eaf-a733-44340b19d354-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.647228 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb9d4\" (UniqueName: \"kubernetes.io/projected/ebd9461e-0196-4eaf-a733-44340b19d354-kube-api-access-sb9d4\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.684618 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: 
I0308 20:58:59.730666 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730720 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-config\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730767 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcrpr\" (UniqueName: \"kubernetes.io/projected/4eca16e2-6962-4cad-9cbb-23d33af9c10a-kube-api-access-pcrpr\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730783 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730807 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-config\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/d605915b-24f4-45ec-bb13-7e7097bb288b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730859 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmrh6\" (UniqueName: \"kubernetes.io/projected/d605915b-24f4-45ec-bb13-7e7097bb288b-kube-api-access-mmrh6\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730916 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d605915b-24f4-45ec-bb13-7e7097bb288b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730982 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eca16e2-6962-4cad-9cbb-23d33af9c10a-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.731001 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4eca16e2-6962-4cad-9cbb-23d33af9c10a-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: 
\"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.731029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.731752 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d605915b-24f4-45ec-bb13-7e7097bb288b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.732361 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.732701 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-config\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.733775 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.734289 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-config\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.735511 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.735537 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/be4d4bdf9879c17893725f2f238c9378caa9301312b00496f3cf5ea61073dfc0/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.735813 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.735863 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/633348a338d920df1dcb058c2c84bb4f6e6614f10dd6ecfc5cc803d00153f8ed/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.737581 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4eca16e2-6962-4cad-9cbb-23d33af9c10a-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.749164 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eca16e2-6962-4cad-9cbb-23d33af9c10a-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.753060 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d605915b-24f4-45ec-bb13-7e7097bb288b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.755810 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmrh6\" (UniqueName: \"kubernetes.io/projected/d605915b-24f4-45ec-bb13-7e7097bb288b-kube-api-access-mmrh6\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " 
pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.756694 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcrpr\" (UniqueName: \"kubernetes.io/projected/4eca16e2-6962-4cad-9cbb-23d33af9c10a-kube-api-access-pcrpr\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.780611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.781331 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.926109 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.948775 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.961023 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.111341 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 20:59:00 crc kubenswrapper[4885]: W0308 20:59:00.129215 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c0a1292_7594_49d3_b3f0_2e1a6aa004e2.slice/crio-9fc62a49c76a5bcd0763a24f25d11d140521d54d58deed2f7929cba63a25c8b2 WatchSource:0}: Error finding container 9fc62a49c76a5bcd0763a24f25d11d140521d54d58deed2f7929cba63a25c8b2: Status 404 returned error can't find the container with id 9fc62a49c76a5bcd0763a24f25d11d140521d54d58deed2f7929cba63a25c8b2 Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.201793 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.455194 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.574232 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.957314 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ebd9461e-0196-4eaf-a733-44340b19d354","Type":"ContainerStarted","Data":"1dade7de7daa8c5b89bebaa909d702ebb223d1d31a52106ba20c6947908e5d83"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.957684 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ebd9461e-0196-4eaf-a733-44340b19d354","Type":"ContainerStarted","Data":"059118c9ef699bb0b280b108a95e8510ec7e80cc6e933f0f02eec053a1e5d826"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.957701 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"ebd9461e-0196-4eaf-a733-44340b19d354","Type":"ContainerStarted","Data":"cccdd8e1056c26fbb563e37aaf0672d94926523626c320b579eed1aaafddfc5c"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.960034 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"4eca16e2-6962-4cad-9cbb-23d33af9c10a","Type":"ContainerStarted","Data":"5bacc20730bf2d7aa32126ec4c9edafec9f84aae11176f80c5cfba1ec377c323"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.960081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"4eca16e2-6962-4cad-9cbb-23d33af9c10a","Type":"ContainerStarted","Data":"3ee9a18d842fd2f2b2f73d5ecc8f53993bee1e2028582d670b18f9918d61d8f5"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.960095 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"4eca16e2-6962-4cad-9cbb-23d33af9c10a","Type":"ContainerStarted","Data":"2e8c03152d595674364d783b3c1eaa57676fb9c7422c0d37f82b7633d05b93e6"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.962050 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2","Type":"ContainerStarted","Data":"f1e1b98f69ad7a962faf3c5f57c68ef4c90eb0c0d61884d215b9f2fb78b69ca1"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.962078 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2","Type":"ContainerStarted","Data":"3e82c8cba2d8ea79bd3e21615834f39903a3f11fc300d281ec2ff5877a33c43e"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.962091 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2","Type":"ContainerStarted","Data":"9fc62a49c76a5bcd0763a24f25d11d140521d54d58deed2f7929cba63a25c8b2"} Mar 08 20:59:00 crc 
kubenswrapper[4885]: I0308 20:59:00.964033 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"1cdde225-3478-4566-9019-df846ce962fb","Type":"ContainerStarted","Data":"ab22ce67a252b13b4cd81306e4f70041e4fc1d065efff6149eef4d8dcd870c68"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.964061 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"1cdde225-3478-4566-9019-df846ce962fb","Type":"ContainerStarted","Data":"b6feceebc499df4d23791119af9e17693560f82576bfa59023005f97c477a95d"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.964073 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"1cdde225-3478-4566-9019-df846ce962fb","Type":"ContainerStarted","Data":"b37832bdb20284f645c1f15ef8bca5e51a0a279a99b0a6a9f355cb43ca221e4d"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.981960 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=2.981915751 podStartE2EDuration="2.981915751s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:00.97515165 +0000 UTC m=+5242.371205673" watchObservedRunningTime="2026-03-08 20:59:00.981915751 +0000 UTC m=+5242.377969794" Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.000711 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.000688231 podStartE2EDuration="3.000688231s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:00.994311122 +0000 UTC m=+5242.390365155" watchObservedRunningTime="2026-03-08 20:59:01.000688231 +0000 UTC 
m=+5242.396742264" Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.019295 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.019272457 podStartE2EDuration="3.019272457s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:01.018004943 +0000 UTC m=+5242.414058976" watchObservedRunningTime="2026-03-08 20:59:01.019272457 +0000 UTC m=+5242.415326490" Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.048505 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.048480726 podStartE2EDuration="3.048480726s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:01.044974522 +0000 UTC m=+5242.441028575" watchObservedRunningTime="2026-03-08 20:59:01.048480726 +0000 UTC m=+5242.444534769" Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.104025 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.231069 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 08 20:59:01 crc kubenswrapper[4885]: W0308 20:59:01.239607 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd605915b_24f4_45ec_bb13_7e7097bb288b.slice/crio-4374e61b737331da82fd399301a9db4b10fff3750d07b2c5deb3d8b0a2b40a2e WatchSource:0}: Error finding container 4374e61b737331da82fd399301a9db4b10fff3750d07b2c5deb3d8b0a2b40a2e: Status 404 returned error can't find the container with id 4374e61b737331da82fd399301a9db4b10fff3750d07b2c5deb3d8b0a2b40a2e Mar 08 20:59:01 crc 
kubenswrapper[4885]: I0308 20:59:01.977377 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d605915b-24f4-45ec-bb13-7e7097bb288b","Type":"ContainerStarted","Data":"2653ab3aa7af4c7cb424010346b54c092a232b9945f7a88f950dea7374140593"} Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.977708 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d605915b-24f4-45ec-bb13-7e7097bb288b","Type":"ContainerStarted","Data":"8eb1bbf1deaa7131fbd32b22b9dea7d99848880570be4d9d2e0a5ec34156adfc"} Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.977719 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d605915b-24f4-45ec-bb13-7e7097bb288b","Type":"ContainerStarted","Data":"4374e61b737331da82fd399301a9db4b10fff3750d07b2c5deb3d8b0a2b40a2e"} Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.979975 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"11852e05-e4cd-4884-b382-035694906263","Type":"ContainerStarted","Data":"ee90ca4eb708fa670d4ede507e4dcbb3d4af9ac401b0877e603c6b4428f5034d"} Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.980133 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"11852e05-e4cd-4884-b382-035694906263","Type":"ContainerStarted","Data":"7b7e53d4b24c99f965824abec94febd3afad0c358834f19eaf04ccb79a91cfae"} Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.980172 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"11852e05-e4cd-4884-b382-035694906263","Type":"ContainerStarted","Data":"6e74e98ff97e97200a83ea01eb663ed85dc6711b32db471352dd506f4d7751bb"} Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.009609 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.009562623 
podStartE2EDuration="4.009562623s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:01.993592518 +0000 UTC m=+5243.389646541" watchObservedRunningTime="2026-03-08 20:59:02.009562623 +0000 UTC m=+5243.405616686" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.039449 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.039382678 podStartE2EDuration="4.039382678s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:02.019131148 +0000 UTC m=+5243.415185261" watchObservedRunningTime="2026-03-08 20:59:02.039382678 +0000 UTC m=+5243.435436741" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.560376 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.586884 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.605248 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.926798 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.949190 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.961687 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 
20:59:04.369427 4885 scope.go:117] "RemoveContainer" containerID="4a9dac92cb97fc09835d72492b83bdbd16e3d2d9b07c98a3d36966204fa55732" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.560094 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.587525 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.604854 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.926458 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.949869 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.961633 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.442464 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.445259 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.454872 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.636563 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.636770 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.636840 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnqrf\" (UniqueName: \"kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.739061 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.739212 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.739304 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.739348 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnqrf\" (UniqueName: \"kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.739626 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.776897 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnqrf\" (UniqueName: \"kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.799942 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-sb-2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.800039 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.802807 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.835974 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.846638 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.964578 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.994599 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.033415 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.059097 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.074637 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.085181 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.118102 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.130591 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.133122 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d88bcf99f-q2r8n"] Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.190540 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.194965 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d88bcf99f-q2r8n"] Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.212536 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.341957 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d88bcf99f-q2r8n"] Mar 08 20:59:06 crc kubenswrapper[4885]: E0308 20:59:06.342570 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-nmcd4 ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" podUID="d8c57abb-ec77-4ac1-9ca5-913f466c13ab" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.365812 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.365908 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.365986 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmcd4\" (UniqueName: \"kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.366021 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.371881 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"] Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.373680 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.376592 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.382206 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"] Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.471490 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.471780 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.471814 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.471831 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2cnj\" (UniqueName: \"kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " 
pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.471873 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.472008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmcd4\" (UniqueName: \"kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.472035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.472088 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.472201 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc 
kubenswrapper[4885]: I0308 20:59:06.476595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.476607 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.476662 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.491101 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmcd4\" (UniqueName: \"kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.573851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.573905 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.573939 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2cnj\" (UniqueName: \"kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.573981 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.574021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.574763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.574804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.575069 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.575242 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.594881 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2cnj\" (UniqueName: \"kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.656807 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.695703 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.028117 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerID="7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d" exitCode=0 Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.028177 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerDied","Data":"7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d"} Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.028815 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerStarted","Data":"988413fd6813a018a052c35557173c8058b1b56cf0dc5288e4ec2c652d437f89"} Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.029628 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.044695 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.080654 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc\") pod \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.080705 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmcd4\" (UniqueName: \"kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4\") pod \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.080787 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb\") pod \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.080817 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config\") pod \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.081598 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8c57abb-ec77-4ac1-9ca5-913f466c13ab" (UID: "d8c57abb-ec77-4ac1-9ca5-913f466c13ab"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.083879 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config" (OuterVolumeSpecName: "config") pod "d8c57abb-ec77-4ac1-9ca5-913f466c13ab" (UID: "d8c57abb-ec77-4ac1-9ca5-913f466c13ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.084292 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8c57abb-ec77-4ac1-9ca5-913f466c13ab" (UID: "d8c57abb-ec77-4ac1-9ca5-913f466c13ab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.089566 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4" (OuterVolumeSpecName: "kube-api-access-nmcd4") pod "d8c57abb-ec77-4ac1-9ca5-913f466c13ab" (UID: "d8c57abb-ec77-4ac1-9ca5-913f466c13ab"). InnerVolumeSpecName "kube-api-access-nmcd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.185823 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.185901 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmcd4\" (UniqueName: \"kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.185970 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.185991 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:07 crc kubenswrapper[4885]: W0308 20:59:07.200569 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97015d9c_53b3_463a_8953_0c5338fbaefe.slice/crio-9baaed9043ae4e717157055effd88e4c50c4eb535de4ba919caa286fe87c1640 WatchSource:0}: Error finding container 9baaed9043ae4e717157055effd88e4c50c4eb535de4ba919caa286fe87c1640: Status 404 returned error can't find the container with id 9baaed9043ae4e717157055effd88e4c50c4eb535de4ba919caa286fe87c1640 Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.200742 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"] Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.039888 4885 generic.go:334] "Generic (PLEG): container finished" podID="97015d9c-53b3-463a-8953-0c5338fbaefe" 
containerID="478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c" exitCode=0 Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.039974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" event={"ID":"97015d9c-53b3-463a-8953-0c5338fbaefe","Type":"ContainerDied","Data":"478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c"} Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.040210 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.040243 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" event={"ID":"97015d9c-53b3-463a-8953-0c5338fbaefe","Type":"ContainerStarted","Data":"9baaed9043ae4e717157055effd88e4c50c4eb535de4ba919caa286fe87c1640"} Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.269887 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d88bcf99f-q2r8n"] Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.275658 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d88bcf99f-q2r8n"] Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.054538 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" event={"ID":"97015d9c-53b3-463a-8953-0c5338fbaefe","Type":"ContainerStarted","Data":"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4"} Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.061114 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.062669 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerID="c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d" exitCode=0 Mar 08 20:59:09 crc 
kubenswrapper[4885]: I0308 20:59:09.062714 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerDied","Data":"c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d"} Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.089675 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" podStartSLOduration=3.089651524 podStartE2EDuration="3.089651524s" podCreationTimestamp="2026-03-08 20:59:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:09.083576741 +0000 UTC m=+5250.479630804" watchObservedRunningTime="2026-03-08 20:59:09.089651524 +0000 UTC m=+5250.485705557" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.141060 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.142085 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.147220 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.156533 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.220853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.221009 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.221111 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmb5r\" (UniqueName: \"kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.323901 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmb5r\" (UniqueName: \"kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.324080 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.324130 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.329452 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.329720 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0d13b9839d6363de7430b2dc3885042ff5afd37c33060049b87006ee21f82e9a/globalmount\"" pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.335432 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.360500 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmb5r\" (UniqueName: 
\"kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.391772 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c57abb-ec77-4ac1-9ca5-913f466c13ab" path="/var/lib/kubelet/pods/d8c57abb-ec77-4ac1-9ca5-913f466c13ab/volumes" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.396099 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.479507 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 08 20:59:10 crc kubenswrapper[4885]: I0308 20:59:10.075433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerStarted","Data":"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483"} Mar 08 20:59:10 crc kubenswrapper[4885]: I0308 20:59:10.097832 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 08 20:59:10 crc kubenswrapper[4885]: I0308 20:59:10.106514 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hq8r2" podStartSLOduration=2.552663489 podStartE2EDuration="5.106494307s" podCreationTimestamp="2026-03-08 20:59:05 +0000 UTC" firstStartedPulling="2026-03-08 20:59:07.0306497 +0000 UTC m=+5248.426703733" lastFinishedPulling="2026-03-08 20:59:09.584480528 +0000 UTC m=+5250.980534551" observedRunningTime="2026-03-08 20:59:10.106283862 +0000 
UTC m=+5251.502337885" watchObservedRunningTime="2026-03-08 20:59:10.106494307 +0000 UTC m=+5251.502548330" Mar 08 20:59:11 crc kubenswrapper[4885]: I0308 20:59:11.083109 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a086771f-d0fc-4265-b8ba-a414a7f6c7d0","Type":"ContainerStarted","Data":"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19"} Mar 08 20:59:11 crc kubenswrapper[4885]: I0308 20:59:11.083668 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a086771f-d0fc-4265-b8ba-a414a7f6c7d0","Type":"ContainerStarted","Data":"2b2a6f955da79537fe6939ae44e2e8e65e67c9ab78da8f292a75babe5150e678"} Mar 08 20:59:11 crc kubenswrapper[4885]: I0308 20:59:11.105616 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.562767313 podStartE2EDuration="3.105590998s" podCreationTimestamp="2026-03-08 20:59:08 +0000 UTC" firstStartedPulling="2026-03-08 20:59:10.110672799 +0000 UTC m=+5251.506726842" lastFinishedPulling="2026-03-08 20:59:10.653496504 +0000 UTC m=+5252.049550527" observedRunningTime="2026-03-08 20:59:11.09815977 +0000 UTC m=+5252.494213833" watchObservedRunningTime="2026-03-08 20:59:11.105590998 +0000 UTC m=+5252.501645031" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.075323 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.076061 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.139089 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.217299 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.396980 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.697977 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.772484 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"] Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.946904 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.949138 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.951399 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.959301 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.959957 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hg5dp" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.960499 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.073490 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: 
I0308 20:59:17.073549 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-config\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.073569 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.073675 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm9hp\" (UniqueName: \"kubernetes.io/projected/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-kube-api-access-sm9hp\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.073758 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-scripts\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.134137 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="dnsmasq-dns" containerID="cri-o://04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5" gracePeriod=10 Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175099 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-scripts\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175196 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-config\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175274 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175337 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm9hp\" (UniqueName: \"kubernetes.io/projected/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-kube-api-access-sm9hp\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175816 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 
20:59:17.176289 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-scripts\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.176689 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-config\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.180678 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.190182 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm9hp\" (UniqueName: \"kubernetes.io/projected/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-kube-api-access-sm9hp\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.272193 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.827433 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.918069 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config\") pod \"8e1559f2-4966-4752-8c07-aea40781bbd3\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.918191 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vx9b\" (UniqueName: \"kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b\") pod \"8e1559f2-4966-4752-8c07-aea40781bbd3\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.918208 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc\") pod \"8e1559f2-4966-4752-8c07-aea40781bbd3\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.924457 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b" (OuterVolumeSpecName: "kube-api-access-9vx9b") pod "8e1559f2-4966-4752-8c07-aea40781bbd3" (UID: "8e1559f2-4966-4752-8c07-aea40781bbd3"). InnerVolumeSpecName "kube-api-access-9vx9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.938110 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 20:59:17 crc kubenswrapper[4885]: W0308 20:59:17.947410 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e81fc01_0a65_4956_9ba5_26ec5f7c25c9.slice/crio-d6bea0b2762f79a29c7555bfeee7055fd2e1a2ca27b7943744bcf42b5c7efb7c WatchSource:0}: Error finding container d6bea0b2762f79a29c7555bfeee7055fd2e1a2ca27b7943744bcf42b5c7efb7c: Status 404 returned error can't find the container with id d6bea0b2762f79a29c7555bfeee7055fd2e1a2ca27b7943744bcf42b5c7efb7c Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.964914 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e1559f2-4966-4752-8c07-aea40781bbd3" (UID: "8e1559f2-4966-4752-8c07-aea40781bbd3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.981112 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config" (OuterVolumeSpecName: "config") pod "8e1559f2-4966-4752-8c07-aea40781bbd3" (UID: "8e1559f2-4966-4752-8c07-aea40781bbd3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.020115 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vx9b\" (UniqueName: \"kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.020143 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.020152 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.144676 4885 generic.go:334] "Generic (PLEG): container finished" podID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerID="04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5" exitCode=0 Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.144781 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.144798 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" event={"ID":"8e1559f2-4966-4752-8c07-aea40781bbd3","Type":"ContainerDied","Data":"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5"} Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.144875 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" event={"ID":"8e1559f2-4966-4752-8c07-aea40781bbd3","Type":"ContainerDied","Data":"e38cb66edab29ad6cc751ddfec546724fea6786800c820b585d732bbfd60f672"} Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.144895 4885 scope.go:117] "RemoveContainer" containerID="04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.147179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9","Type":"ContainerStarted","Data":"e8bc72a3b9172c08f37114013e31d81e70218a0ddae0d38248d1660be4df7772"} Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.147216 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9","Type":"ContainerStarted","Data":"d6bea0b2762f79a29c7555bfeee7055fd2e1a2ca27b7943744bcf42b5c7efb7c"} Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.147442 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hq8r2" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="registry-server" containerID="cri-o://3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483" gracePeriod=2 Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.177186 4885 scope.go:117] "RemoveContainer" 
containerID="42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.184606 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"] Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.192419 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"] Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.203224 4885 scope.go:117] "RemoveContainer" containerID="04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5" Mar 08 20:59:18 crc kubenswrapper[4885]: E0308 20:59:18.203554 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5\": container with ID starting with 04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5 not found: ID does not exist" containerID="04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.203582 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5"} err="failed to get container status \"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5\": rpc error: code = NotFound desc = could not find container \"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5\": container with ID starting with 04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5 not found: ID does not exist" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.203605 4885 scope.go:117] "RemoveContainer" containerID="42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43" Mar 08 20:59:18 crc kubenswrapper[4885]: E0308 20:59:18.203781 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43\": container with ID starting with 42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43 not found: ID does not exist" containerID="42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.203800 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43"} err="failed to get container status \"42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43\": rpc error: code = NotFound desc = could not find container \"42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43\": container with ID starting with 42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43 not found: ID does not exist" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.593231 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.630791 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities\") pod \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.630841 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content\") pod \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.630979 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnqrf\" (UniqueName: \"kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf\") pod \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.632988 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities" (OuterVolumeSpecName: "utilities") pod "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" (UID: "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.641226 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf" (OuterVolumeSpecName: "kube-api-access-wnqrf") pod "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" (UID: "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d"). InnerVolumeSpecName "kube-api-access-wnqrf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.672701 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" (UID: "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.733096 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnqrf\" (UniqueName: \"kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.733402 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.733445 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.158693 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9","Type":"ContainerStarted","Data":"662d70db5585e2c725bc484b19e622e88e9a75dd69b9c6a2613a47dbbb6e9f00"} Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.159095 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.162765 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" 
containerID="3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483" exitCode=0 Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.162851 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerDied","Data":"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483"} Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.162970 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerDied","Data":"988413fd6813a018a052c35557173c8058b1b56cf0dc5288e4ec2c652d437f89"} Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.162874 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.163014 4885 scope.go:117] "RemoveContainer" containerID="3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.203621 4885 scope.go:117] "RemoveContainer" containerID="c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.210376 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.210344291 podStartE2EDuration="3.210344291s" podCreationTimestamp="2026-03-08 20:59:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:19.188299533 +0000 UTC m=+5260.584353596" watchObservedRunningTime="2026-03-08 20:59:19.210344291 +0000 UTC m=+5260.606398324" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.229156 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.235726 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.240107 4885 scope.go:117] "RemoveContainer" containerID="7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.263815 4885 scope.go:117] "RemoveContainer" containerID="3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483" Mar 08 20:59:19 crc kubenswrapper[4885]: E0308 20:59:19.267491 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483\": container with ID starting with 3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483 not found: ID does not exist" containerID="3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.267521 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483"} err="failed to get container status \"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483\": rpc error: code = NotFound desc = could not find container \"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483\": container with ID starting with 3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483 not found: ID does not exist" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.267540 4885 scope.go:117] "RemoveContainer" containerID="c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d" Mar 08 20:59:19 crc kubenswrapper[4885]: E0308 20:59:19.268240 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d\": container with ID starting with c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d not found: ID does not exist" containerID="c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.268302 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d"} err="failed to get container status \"c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d\": rpc error: code = NotFound desc = could not find container \"c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d\": container with ID starting with c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d not found: ID does not exist" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.268346 4885 scope.go:117] "RemoveContainer" containerID="7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d" Mar 08 20:59:19 crc kubenswrapper[4885]: E0308 20:59:19.268912 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d\": container with ID starting with 7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d not found: ID does not exist" containerID="7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.268945 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d"} err="failed to get container status \"7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d\": rpc error: code = NotFound desc = could not find container \"7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d\": 
container with ID starting with 7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d not found: ID does not exist" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.388669 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" path="/var/lib/kubelet/pods/8e1559f2-4966-4752-8c07-aea40781bbd3/volumes" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.389875 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" path="/var/lib/kubelet/pods/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d/volumes" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.171376 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dkqb5"] Mar 08 20:59:22 crc kubenswrapper[4885]: E0308 20:59:22.172067 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="extract-content" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172080 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="extract-content" Mar 08 20:59:22 crc kubenswrapper[4885]: E0308 20:59:22.172100 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="registry-server" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172106 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="registry-server" Mar 08 20:59:22 crc kubenswrapper[4885]: E0308 20:59:22.172120 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="dnsmasq-dns" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172126 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="dnsmasq-dns" Mar 08 20:59:22 crc 
kubenswrapper[4885]: E0308 20:59:22.172136 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="extract-utilities" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172142 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="extract-utilities" Mar 08 20:59:22 crc kubenswrapper[4885]: E0308 20:59:22.172156 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="init" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172162 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="init" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172317 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="dnsmasq-dns" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172332 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="registry-server" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172786 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.209500 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbs4\" (UniqueName: \"kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.209607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.215226 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dkqb5"] Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.264502 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c81b-account-create-update-sqjkr"] Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.265534 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.267983 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.270384 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c81b-account-create-update-sqjkr"] Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.311740 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbs4\" (UniqueName: \"kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.311779 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.312458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.329910 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbs4\" (UniqueName: \"kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc 
kubenswrapper[4885]: I0308 20:59:22.415060 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k8d6\" (UniqueName: \"kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6\") pod \"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.415633 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts\") pod \"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.516556 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k8d6\" (UniqueName: \"kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6\") pod \"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.516649 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts\") pod \"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.517469 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts\") pod 
\"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr"
Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.522722 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dkqb5"
Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.544714 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k8d6\" (UniqueName: \"kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6\") pod \"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr"
Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.586535 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c81b-account-create-update-sqjkr"
Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.813019 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dkqb5"]
Mar 08 20:59:22 crc kubenswrapper[4885]: W0308 20:59:22.817629 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af0fb78_1571_4090_a0e4_009deb2915a5.slice/crio-96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a WatchSource:0}: Error finding container 96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a: Status 404 returned error can't find the container with id 96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a
Mar 08 20:59:23 crc kubenswrapper[4885]: I0308 20:59:23.123649 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c81b-account-create-update-sqjkr"]
Mar 08 20:59:23 crc kubenswrapper[4885]: W0308 20:59:23.126258 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddadcfb24_7e2e_42d4_b4da_4567105c11ad.slice/crio-a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957 WatchSource:0}: Error finding container a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957: Status 404 returned error can't find the container with id a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957
Mar 08 20:59:23 crc kubenswrapper[4885]: I0308 20:59:23.206628 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c81b-account-create-update-sqjkr" event={"ID":"dadcfb24-7e2e-42d4-b4da-4567105c11ad","Type":"ContainerStarted","Data":"a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957"}
Mar 08 20:59:23 crc kubenswrapper[4885]: I0308 20:59:23.208477 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dkqb5" event={"ID":"2af0fb78-1571-4090-a0e4-009deb2915a5","Type":"ContainerStarted","Data":"2a92dad5038281e9a909493af60718e80efbed4b0a30aad4e3ed0e4f55630488"}
Mar 08 20:59:23 crc kubenswrapper[4885]: I0308 20:59:23.208518 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dkqb5" event={"ID":"2af0fb78-1571-4090-a0e4-009deb2915a5","Type":"ContainerStarted","Data":"96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a"}
Mar 08 20:59:23 crc kubenswrapper[4885]: I0308 20:59:23.221548 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-dkqb5" podStartSLOduration=1.22153053 podStartE2EDuration="1.22153053s" podCreationTimestamp="2026-03-08 20:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:23.220623125 +0000 UTC m=+5264.616677148" watchObservedRunningTime="2026-03-08 20:59:23.22153053 +0000 UTC m=+5264.617584553"
Mar 08 20:59:24 crc kubenswrapper[4885]: I0308 20:59:24.221863 4885 generic.go:334] "Generic (PLEG): container finished" podID="2af0fb78-1571-4090-a0e4-009deb2915a5" containerID="2a92dad5038281e9a909493af60718e80efbed4b0a30aad4e3ed0e4f55630488" exitCode=0
Mar 08 20:59:24 crc kubenswrapper[4885]: I0308 20:59:24.222008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dkqb5" event={"ID":"2af0fb78-1571-4090-a0e4-009deb2915a5","Type":"ContainerDied","Data":"2a92dad5038281e9a909493af60718e80efbed4b0a30aad4e3ed0e4f55630488"}
Mar 08 20:59:24 crc kubenswrapper[4885]: I0308 20:59:24.224825 4885 generic.go:334] "Generic (PLEG): container finished" podID="dadcfb24-7e2e-42d4-b4da-4567105c11ad" containerID="b2ba1b445c0bfbdc509da995c43b1467221966fc77b2d2c35df9edb0c74ad904" exitCode=0
Mar 08 20:59:24 crc kubenswrapper[4885]: I0308 20:59:24.224895 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c81b-account-create-update-sqjkr" event={"ID":"dadcfb24-7e2e-42d4-b4da-4567105c11ad","Type":"ContainerDied","Data":"b2ba1b445c0bfbdc509da995c43b1467221966fc77b2d2c35df9edb0c74ad904"}
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.751855 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c81b-account-create-update-sqjkr"
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.763682 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dkqb5"
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.783650 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts\") pod \"2af0fb78-1571-4090-a0e4-009deb2915a5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") "
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.783771 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxbs4\" (UniqueName: \"kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4\") pod \"2af0fb78-1571-4090-a0e4-009deb2915a5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") "
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.783806 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k8d6\" (UniqueName: \"kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6\") pod \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") "
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.783874 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts\") pod \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") "
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.784530 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2af0fb78-1571-4090-a0e4-009deb2915a5" (UID: "2af0fb78-1571-4090-a0e4-009deb2915a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.784680 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dadcfb24-7e2e-42d4-b4da-4567105c11ad" (UID: "dadcfb24-7e2e-42d4-b4da-4567105c11ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.790707 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4" (OuterVolumeSpecName: "kube-api-access-nxbs4") pod "2af0fb78-1571-4090-a0e4-009deb2915a5" (UID: "2af0fb78-1571-4090-a0e4-009deb2915a5"). InnerVolumeSpecName "kube-api-access-nxbs4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.795404 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6" (OuterVolumeSpecName: "kube-api-access-9k8d6") pod "dadcfb24-7e2e-42d4-b4da-4567105c11ad" (UID: "dadcfb24-7e2e-42d4-b4da-4567105c11ad"). InnerVolumeSpecName "kube-api-access-9k8d6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.885560 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.885606 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxbs4\" (UniqueName: \"kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.885618 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k8d6\" (UniqueName: \"kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.885631 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.253200 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dkqb5" event={"ID":"2af0fb78-1571-4090-a0e4-009deb2915a5","Type":"ContainerDied","Data":"96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a"}
Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.253267 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a"
Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.253343 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dkqb5"
Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.256887 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c81b-account-create-update-sqjkr" event={"ID":"dadcfb24-7e2e-42d4-b4da-4567105c11ad","Type":"ContainerDied","Data":"a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957"}
Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.256976 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957"
Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.257049 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c81b-account-create-update-sqjkr"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.737892 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-nbq5w"]
Mar 08 20:59:27 crc kubenswrapper[4885]: E0308 20:59:27.738510 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadcfb24-7e2e-42d4-b4da-4567105c11ad" containerName="mariadb-account-create-update"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.738523 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadcfb24-7e2e-42d4-b4da-4567105c11ad" containerName="mariadb-account-create-update"
Mar 08 20:59:27 crc kubenswrapper[4885]: E0308 20:59:27.738544 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af0fb78-1571-4090-a0e4-009deb2915a5" containerName="mariadb-database-create"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.738553 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af0fb78-1571-4090-a0e4-009deb2915a5" containerName="mariadb-database-create"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.738701 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af0fb78-1571-4090-a0e4-009deb2915a5" containerName="mariadb-database-create"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.738721 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadcfb24-7e2e-42d4-b4da-4567105c11ad" containerName="mariadb-account-create-update"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.739252 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.741847 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.742079 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.742404 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pcbss"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.743522 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.752929 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-nbq5w"]
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.824362 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82tsl\" (UniqueName: \"kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.824467 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.824501 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.926207 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82tsl\" (UniqueName: \"kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.926349 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.926384 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.931104 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.931679 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.943472 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82tsl\" (UniqueName: \"kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:28 crc kubenswrapper[4885]: I0308 20:59:28.062513 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:28 crc kubenswrapper[4885]: I0308 20:59:28.591802 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-nbq5w"]
Mar 08 20:59:28 crc kubenswrapper[4885]: W0308 20:59:28.597735 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda21d9a63_6439_41e2_915d_9ffa3d014a30.slice/crio-e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377 WatchSource:0}: Error finding container e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377: Status 404 returned error can't find the container with id e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377
Mar 08 20:59:29 crc kubenswrapper[4885]: I0308 20:59:29.287691 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nbq5w" event={"ID":"a21d9a63-6439-41e2-915d-9ffa3d014a30","Type":"ContainerStarted","Data":"e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377"}
Mar 08 20:59:30 crc kubenswrapper[4885]: I0308 20:59:30.298697 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nbq5w" event={"ID":"a21d9a63-6439-41e2-915d-9ffa3d014a30","Type":"ContainerStarted","Data":"3210dc1871dd8ae46bd14950976c866628de438227db3aa55b84daa5b1afb3d6"}
Mar 08 20:59:30 crc kubenswrapper[4885]: I0308 20:59:30.331359 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-nbq5w" podStartSLOduration=3.331333552 podStartE2EDuration="3.331333552s" podCreationTimestamp="2026-03-08 20:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:30.323256637 +0000 UTC m=+5271.719310700" watchObservedRunningTime="2026-03-08 20:59:30.331333552 +0000 UTC m=+5271.727387575"
Mar 08 20:59:31 crc kubenswrapper[4885]: I0308 20:59:31.314043 4885 generic.go:334] "Generic (PLEG): container finished" podID="a21d9a63-6439-41e2-915d-9ffa3d014a30" containerID="3210dc1871dd8ae46bd14950976c866628de438227db3aa55b84daa5b1afb3d6" exitCode=0
Mar 08 20:59:31 crc kubenswrapper[4885]: I0308 20:59:31.314123 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nbq5w" event={"ID":"a21d9a63-6439-41e2-915d-9ffa3d014a30","Type":"ContainerDied","Data":"3210dc1871dd8ae46bd14950976c866628de438227db3aa55b84daa5b1afb3d6"}
Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.782653 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.930621 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data\") pod \"a21d9a63-6439-41e2-915d-9ffa3d014a30\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") "
Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.931210 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle\") pod \"a21d9a63-6439-41e2-915d-9ffa3d014a30\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") "
Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.931420 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82tsl\" (UniqueName: \"kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl\") pod \"a21d9a63-6439-41e2-915d-9ffa3d014a30\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") "
Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.942129 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl" (OuterVolumeSpecName: "kube-api-access-82tsl") pod "a21d9a63-6439-41e2-915d-9ffa3d014a30" (UID: "a21d9a63-6439-41e2-915d-9ffa3d014a30"). InnerVolumeSpecName "kube-api-access-82tsl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.978792 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a21d9a63-6439-41e2-915d-9ffa3d014a30" (UID: "a21d9a63-6439-41e2-915d-9ffa3d014a30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.007271 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data" (OuterVolumeSpecName: "config-data") pod "a21d9a63-6439-41e2-915d-9ffa3d014a30" (UID: "a21d9a63-6439-41e2-915d-9ffa3d014a30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.033366 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.033406 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82tsl\" (UniqueName: \"kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.033420 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.336594 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nbq5w" event={"ID":"a21d9a63-6439-41e2-915d-9ffa3d014a30","Type":"ContainerDied","Data":"e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377"}
Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.336639 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377"
Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.336722 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nbq5w"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.068397 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dt79q"]
Mar 08 20:59:34 crc kubenswrapper[4885]: E0308 20:59:34.068787 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21d9a63-6439-41e2-915d-9ffa3d014a30" containerName="keystone-db-sync"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.068801 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21d9a63-6439-41e2-915d-9ffa3d014a30" containerName="keystone-db-sync"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.068999 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21d9a63-6439-41e2-915d-9ffa3d014a30" containerName="keystone-db-sync"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.069517 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.073246 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pcbss"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.073597 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.073737 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.077290 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.077582 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.125095 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"]
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.126455 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.130597 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dt79q"]
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.138974 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"]
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152179 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152242 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2d7q\" (UniqueName: \"kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152299 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152327 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152373 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152432 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.253818 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.254137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.254288 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.254987 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.255181 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2d7q\" (UniqueName: \"kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.255353 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.255515 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.255651 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.255839 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.256016 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.256146 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmq7z\" (UniqueName: \"kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.256285 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.256411 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.257800 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.259292 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.259307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.260082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.274078 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2d7q\" (UniqueName: \"kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.358848 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.359218 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmq7z\" (UniqueName: \"kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.359313 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.359390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.363772 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.364270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.364280 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.392234 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmq7z\" (UniqueName: \"kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.443266 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.689516 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dt79q"
Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.921705 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"]
Mar 08 20:59:34 crc kubenswrapper[4885]: W0308 20:59:34.928592 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4efef13_d123_4300_b581_2a9a52de6d1b.slice/crio-951c5b88e4e526df079bda44e005142fe20c5e1f10cb2f30511bef46f3a2b1e7 WatchSource:0}: Error finding container 951c5b88e4e526df079bda44e005142fe20c5e1f10cb2f30511bef46f3a2b1e7: Status 404 returned error can't find the container with id 951c5b88e4e526df079bda44e005142fe20c5e1f10cb2f30511bef46f3a2b1e7
Mar 08 20:59:35 crc kubenswrapper[4885]: I0308 20:59:35.165710 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dt79q"]
Mar 08 20:59:35 crc kubenswrapper[4885]: I0308 20:59:35.354823 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerID="d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9" exitCode=0
Mar 08 20:59:35 crc kubenswrapper[4885]: I0308 20:59:35.354899 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" event={"ID":"c4efef13-d123-4300-b581-2a9a52de6d1b","Type":"ContainerDied","Data":"d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9"}
Mar 08 20:59:35 crc kubenswrapper[4885]: I0308 20:59:35.355232 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" event={"ID":"c4efef13-d123-4300-b581-2a9a52de6d1b","Type":"ContainerStarted","Data":"951c5b88e4e526df079bda44e005142fe20c5e1f10cb2f30511bef46f3a2b1e7"}
Mar 08 20:59:35 crc kubenswrapper[4885]: I0308 20:59:35.357307 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dt79q"
event={"ID":"8816a3aa-9268-4201-9ad0-bc816fdaba11","Type":"ContainerStarted","Data":"b2a6d60ac180e7237cc570d9e3ce07878fe370c6ed8d4f85797a8e711a06d792"} Mar 08 20:59:36 crc kubenswrapper[4885]: I0308 20:59:36.368570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dt79q" event={"ID":"8816a3aa-9268-4201-9ad0-bc816fdaba11","Type":"ContainerStarted","Data":"93dc9dbb2536460c751fb5259c99f80a1281e51794443a335faf96ba42cb4c59"} Mar 08 20:59:36 crc kubenswrapper[4885]: I0308 20:59:36.371447 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" event={"ID":"c4efef13-d123-4300-b581-2a9a52de6d1b","Type":"ContainerStarted","Data":"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0"} Mar 08 20:59:36 crc kubenswrapper[4885]: I0308 20:59:36.371828 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:36 crc kubenswrapper[4885]: I0308 20:59:36.396756 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dt79q" podStartSLOduration=2.396731454 podStartE2EDuration="2.396731454s" podCreationTimestamp="2026-03-08 20:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:36.393835167 +0000 UTC m=+5277.789889200" watchObservedRunningTime="2026-03-08 20:59:36.396731454 +0000 UTC m=+5277.792785477" Mar 08 20:59:36 crc kubenswrapper[4885]: I0308 20:59:36.424501 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" podStartSLOduration=2.424479934 podStartE2EDuration="2.424479934s" podCreationTimestamp="2026-03-08 20:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:36.42136522 +0000 UTC 
m=+5277.817419243" watchObservedRunningTime="2026-03-08 20:59:36.424479934 +0000 UTC m=+5277.820533957" Mar 08 20:59:37 crc kubenswrapper[4885]: I0308 20:59:37.385218 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 08 20:59:39 crc kubenswrapper[4885]: I0308 20:59:39.403255 4885 generic.go:334] "Generic (PLEG): container finished" podID="8816a3aa-9268-4201-9ad0-bc816fdaba11" containerID="93dc9dbb2536460c751fb5259c99f80a1281e51794443a335faf96ba42cb4c59" exitCode=0 Mar 08 20:59:39 crc kubenswrapper[4885]: I0308 20:59:39.403378 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dt79q" event={"ID":"8816a3aa-9268-4201-9ad0-bc816fdaba11","Type":"ContainerDied","Data":"93dc9dbb2536460c751fb5259c99f80a1281e51794443a335faf96ba42cb4c59"} Mar 08 20:59:40 crc kubenswrapper[4885]: I0308 20:59:40.942063 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.079897 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmq7z\" (UniqueName: \"kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z\") pod \"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.080371 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data\") pod \"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.080452 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys\") pod 
\"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.080563 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle\") pod \"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.080609 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys\") pod \"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.080652 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts\") pod \"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.086536 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.087331 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts" (OuterVolumeSpecName: "scripts") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.087254 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z" (OuterVolumeSpecName: "kube-api-access-kmq7z") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "kube-api-access-kmq7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.089395 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.107577 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data" (OuterVolumeSpecName: "config-data") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.108514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183662 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183695 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183705 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183714 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmq7z\" (UniqueName: \"kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183724 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183733 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.430911 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dt79q" event={"ID":"8816a3aa-9268-4201-9ad0-bc816fdaba11","Type":"ContainerDied","Data":"b2a6d60ac180e7237cc570d9e3ce07878fe370c6ed8d4f85797a8e711a06d792"} Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 
20:59:41.431002 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2a6d60ac180e7237cc570d9e3ce07878fe370c6ed8d4f85797a8e711a06d792" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.431039 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.528589 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dt79q"] Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.542401 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dt79q"] Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.627854 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bqsgq"] Mar 08 20:59:41 crc kubenswrapper[4885]: E0308 20:59:41.628303 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8816a3aa-9268-4201-9ad0-bc816fdaba11" containerName="keystone-bootstrap" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.628325 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8816a3aa-9268-4201-9ad0-bc816fdaba11" containerName="keystone-bootstrap" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.628503 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8816a3aa-9268-4201-9ad0-bc816fdaba11" containerName="keystone-bootstrap" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.629165 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.638081 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bqsgq"] Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.669598 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.670008 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.670243 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.670540 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pcbss" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.671128 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.693961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gxzs\" (UniqueName: \"kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.694216 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.694425 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.694512 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.694634 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.694696 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796353 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796426 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796521 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796595 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gxzs\" (UniqueName: \"kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796646 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.802467 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts\") pod \"keystone-bootstrap-bqsgq\" (UID: 
\"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.802804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.806967 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.814624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.815208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.818076 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gxzs\" (UniqueName: \"kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 
20:59:41.993818 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:42 crc kubenswrapper[4885]: I0308 20:59:42.923484 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bqsgq"] Mar 08 20:59:43 crc kubenswrapper[4885]: I0308 20:59:43.388412 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8816a3aa-9268-4201-9ad0-bc816fdaba11" path="/var/lib/kubelet/pods/8816a3aa-9268-4201-9ad0-bc816fdaba11/volumes" Mar 08 20:59:43 crc kubenswrapper[4885]: I0308 20:59:43.449754 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqsgq" event={"ID":"8bd0921d-5173-43dd-ac53-0ec3417dce77","Type":"ContainerStarted","Data":"de07ff485f289c819ea06e6137cf2c359f8f6dec75a1a6a503e4f8a88ac8bac6"} Mar 08 20:59:43 crc kubenswrapper[4885]: I0308 20:59:43.450103 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqsgq" event={"ID":"8bd0921d-5173-43dd-ac53-0ec3417dce77","Type":"ContainerStarted","Data":"44c7f5c32ca6bb4cbe52759db36b680e8277e291b16a7ce4f458e870d81e9f1c"} Mar 08 20:59:43 crc kubenswrapper[4885]: I0308 20:59:43.474944 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bqsgq" podStartSLOduration=2.474907733 podStartE2EDuration="2.474907733s" podCreationTimestamp="2026-03-08 20:59:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:43.469540029 +0000 UTC m=+5284.865594062" watchObservedRunningTime="2026-03-08 20:59:43.474907733 +0000 UTC m=+5284.870961776" Mar 08 20:59:44 crc kubenswrapper[4885]: I0308 20:59:44.444095 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:44 crc kubenswrapper[4885]: I0308 20:59:44.524139 4885 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"] Mar 08 20:59:44 crc kubenswrapper[4885]: I0308 20:59:44.524451 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="dnsmasq-dns" containerID="cri-o://4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4" gracePeriod=10 Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.011512 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.056357 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config\") pod \"97015d9c-53b3-463a-8953-0c5338fbaefe\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.056410 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc\") pod \"97015d9c-53b3-463a-8953-0c5338fbaefe\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.056451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb\") pod \"97015d9c-53b3-463a-8953-0c5338fbaefe\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.056515 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2cnj\" (UniqueName: \"kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj\") pod \"97015d9c-53b3-463a-8953-0c5338fbaefe\" (UID: 
\"97015d9c-53b3-463a-8953-0c5338fbaefe\") " Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.056543 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb\") pod \"97015d9c-53b3-463a-8953-0c5338fbaefe\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.064093 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj" (OuterVolumeSpecName: "kube-api-access-d2cnj") pod "97015d9c-53b3-463a-8953-0c5338fbaefe" (UID: "97015d9c-53b3-463a-8953-0c5338fbaefe"). InnerVolumeSpecName "kube-api-access-d2cnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.095192 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "97015d9c-53b3-463a-8953-0c5338fbaefe" (UID: "97015d9c-53b3-463a-8953-0c5338fbaefe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.100066 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "97015d9c-53b3-463a-8953-0c5338fbaefe" (UID: "97015d9c-53b3-463a-8953-0c5338fbaefe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.110368 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config" (OuterVolumeSpecName: "config") pod "97015d9c-53b3-463a-8953-0c5338fbaefe" (UID: "97015d9c-53b3-463a-8953-0c5338fbaefe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.112631 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "97015d9c-53b3-463a-8953-0c5338fbaefe" (UID: "97015d9c-53b3-463a-8953-0c5338fbaefe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.157697 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.157723 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2cnj\" (UniqueName: \"kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.157735 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.157757 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config\") on node \"crc\" DevicePath \"\"" Mar 08 
20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.157767 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.468376 4885 generic.go:334] "Generic (PLEG): container finished" podID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerID="4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4" exitCode=0
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.468477 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" event={"ID":"97015d9c-53b3-463a-8953-0c5338fbaefe","Type":"ContainerDied","Data":"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4"}
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.468796 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" event={"ID":"97015d9c-53b3-463a-8953-0c5338fbaefe","Type":"ContainerDied","Data":"9baaed9043ae4e717157055effd88e4c50c4eb535de4ba919caa286fe87c1640"}
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.468831 4885 scope.go:117] "RemoveContainer" containerID="4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4"
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.468568 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc"
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.504140 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"]
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.518114 4885 scope.go:117] "RemoveContainer" containerID="478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c"
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.521597 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"]
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.542139 4885 scope.go:117] "RemoveContainer" containerID="4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4"
Mar 08 20:59:45 crc kubenswrapper[4885]: E0308 20:59:45.542454 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4\": container with ID starting with 4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4 not found: ID does not exist" containerID="4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4"
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.542486 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4"} err="failed to get container status \"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4\": rpc error: code = NotFound desc = could not find container \"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4\": container with ID starting with 4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4 not found: ID does not exist"
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.542507 4885 scope.go:117] "RemoveContainer" containerID="478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c"
Mar 08 20:59:45 crc kubenswrapper[4885]: E0308 20:59:45.542880 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c\": container with ID starting with 478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c not found: ID does not exist" containerID="478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c"
Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.542905 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c"} err="failed to get container status \"478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c\": rpc error: code = NotFound desc = could not find container \"478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c\": container with ID starting with 478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c not found: ID does not exist"
Mar 08 20:59:46 crc kubenswrapper[4885]: I0308 20:59:46.483529 4885 generic.go:334] "Generic (PLEG): container finished" podID="8bd0921d-5173-43dd-ac53-0ec3417dce77" containerID="de07ff485f289c819ea06e6137cf2c359f8f6dec75a1a6a503e4f8a88ac8bac6" exitCode=0
Mar 08 20:59:46 crc kubenswrapper[4885]: I0308 20:59:46.483578 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqsgq" event={"ID":"8bd0921d-5173-43dd-ac53-0ec3417dce77","Type":"ContainerDied","Data":"de07ff485f289c819ea06e6137cf2c359f8f6dec75a1a6a503e4f8a88ac8bac6"}
Mar 08 20:59:47 crc kubenswrapper[4885]: I0308 20:59:47.386912 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" path="/var/lib/kubelet/pods/97015d9c-53b3-463a-8953-0c5338fbaefe/volumes"
Mar 08 20:59:47 crc kubenswrapper[4885]: I0308 20:59:47.859573 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bqsgq"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.010776 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") "
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.010844 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") "
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.010876 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") "
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.011014 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") "
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.011055 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gxzs\" (UniqueName: \"kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") "
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.011091 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") "
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.017802 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts" (OuterVolumeSpecName: "scripts") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.018989 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs" (OuterVolumeSpecName: "kube-api-access-8gxzs") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "kube-api-access-8gxzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.019160 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.019407 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.039109 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.055582 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data" (OuterVolumeSpecName: "config-data") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112863 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112909 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gxzs\" (UniqueName: \"kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112940 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112952 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112962 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112971 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.504178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqsgq" event={"ID":"8bd0921d-5173-43dd-ac53-0ec3417dce77","Type":"ContainerDied","Data":"44c7f5c32ca6bb4cbe52759db36b680e8277e291b16a7ce4f458e870d81e9f1c"}
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.504233 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c7f5c32ca6bb4cbe52759db36b680e8277e291b16a7ce4f458e870d81e9f1c"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.504395 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bqsgq"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.618442 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-868b8c986d-gxm79"]
Mar 08 20:59:48 crc kubenswrapper[4885]: E0308 20:59:48.619057 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="init"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.619080 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="init"
Mar 08 20:59:48 crc kubenswrapper[4885]: E0308 20:59:48.619120 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="dnsmasq-dns"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.619130 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="dnsmasq-dns"
Mar 08 20:59:48 crc kubenswrapper[4885]: E0308 20:59:48.619148 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd0921d-5173-43dd-ac53-0ec3417dce77" containerName="keystone-bootstrap"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.619158 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd0921d-5173-43dd-ac53-0ec3417dce77" containerName="keystone-bootstrap"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.619403 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bd0921d-5173-43dd-ac53-0ec3417dce77" containerName="keystone-bootstrap"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.619427 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="dnsmasq-dns"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.620585 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.623819 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.626768 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.627163 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.627386 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pcbss"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.631711 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-868b8c986d-gxm79"]
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.722879 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-config-data\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.723025 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-combined-ca-bundle\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.723062 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-credential-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.723092 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-scripts\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.723712 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4sws\" (UniqueName: \"kubernetes.io/projected/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-kube-api-access-b4sws\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.723783 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-fernet-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825212 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-combined-ca-bundle\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825284 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-credential-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825312 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-scripts\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825359 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4sws\" (UniqueName: \"kubernetes.io/projected/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-kube-api-access-b4sws\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825395 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-fernet-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825450 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-config-data\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.829628 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-credential-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.829852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-config-data\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.830804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-combined-ca-bundle\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.832785 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-scripts\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.833718 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-fernet-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.845273 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4sws\" (UniqueName: \"kubernetes.io/projected/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-kube-api-access-b4sws\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.945283 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:49 crc kubenswrapper[4885]: I0308 20:59:49.209764 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-868b8c986d-gxm79"]
Mar 08 20:59:49 crc kubenswrapper[4885]: I0308 20:59:49.522681 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-868b8c986d-gxm79" event={"ID":"65bf82e2-5440-45b2-b1ff-1f6998ce46f8","Type":"ContainerStarted","Data":"104827b6c72c1a8e7cd022fab280bd3e172f1884d754f2d955aad0d9afa3f9eb"}
Mar 08 20:59:49 crc kubenswrapper[4885]: I0308 20:59:49.522964 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-868b8c986d-gxm79" event={"ID":"65bf82e2-5440-45b2-b1ff-1f6998ce46f8","Type":"ContainerStarted","Data":"e72c2714ba37458d0847554e1cc8423328a7dc1970d1cb54b19ab93f33fc6e3e"}
Mar 08 20:59:49 crc kubenswrapper[4885]: I0308 20:59:49.522985 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-868b8c986d-gxm79"
Mar 08 20:59:49 crc kubenswrapper[4885]: I0308 20:59:49.540812 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-868b8c986d-gxm79" podStartSLOduration=1.540795559 podStartE2EDuration="1.540795559s" podCreationTimestamp="2026-03-08 20:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:49.539384261 +0000 UTC m=+5290.935438284" watchObservedRunningTime="2026-03-08 20:59:49.540795559 +0000 UTC m=+5290.936849582"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.149264 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550060-ztwmh"]
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.150663 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550060-ztwmh"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.154131 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.155494 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.156038 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.163831 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"]
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.166369 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.170287 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.170780 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.174676 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550060-ztwmh"]
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.186323 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"]
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.345501 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxjc7\" (UniqueName: \"kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7\") pod \"auto-csr-approver-29550060-ztwmh\" (UID: \"a0100b61-a97f-40b6-b8fd-91499667f3d9\") " pod="openshift-infra/auto-csr-approver-29550060-ztwmh"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.345592 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwdhf\" (UniqueName: \"kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.345906 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.345999 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.447671 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxjc7\" (UniqueName: \"kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7\") pod \"auto-csr-approver-29550060-ztwmh\" (UID: \"a0100b61-a97f-40b6-b8fd-91499667f3d9\") " pod="openshift-infra/auto-csr-approver-29550060-ztwmh"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.447797 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwdhf\" (UniqueName: \"kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.447960 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.448004 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.449737 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.459395 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.468263 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxjc7\" (UniqueName: \"kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7\") pod \"auto-csr-approver-29550060-ztwmh\" (UID: \"a0100b61-a97f-40b6-b8fd-91499667f3d9\") " pod="openshift-infra/auto-csr-approver-29550060-ztwmh"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.477266 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwdhf\" (UniqueName: \"kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.496765 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550060-ztwmh"
Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.504549 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.078383 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550060-ztwmh"]
Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.133228 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"]
Mar 08 21:00:01 crc kubenswrapper[4885]: W0308 21:00:01.139813 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62c3bcc1_5dd6_411d_8030_a152617aa0a3.slice/crio-204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc WatchSource:0}: Error finding container 204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc: Status 404 returned error can't find the container with id 204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc
Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.668780 4885 generic.go:334] "Generic (PLEG): container finished" podID="62c3bcc1-5dd6-411d-8030-a152617aa0a3" containerID="e07b444034fa8d1cd5c5dd9ad29413db942d04d6d0dccd4d4f03d228986183ea" exitCode=0
Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.668848 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" event={"ID":"62c3bcc1-5dd6-411d-8030-a152617aa0a3","Type":"ContainerDied","Data":"e07b444034fa8d1cd5c5dd9ad29413db942d04d6d0dccd4d4f03d228986183ea"}
Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.669392 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" event={"ID":"62c3bcc1-5dd6-411d-8030-a152617aa0a3","Type":"ContainerStarted","Data":"204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc"}
Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.671973 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" event={"ID":"a0100b61-a97f-40b6-b8fd-91499667f3d9","Type":"ContainerStarted","Data":"299d8e0735bd0570edd7a482b1c465695e0b3e1b085d547f6d5575811d194fae"}
Mar 08 21:00:02 crc kubenswrapper[4885]: I0308 21:00:02.818696 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 21:00:02 crc kubenswrapper[4885]: I0308 21:00:02.819207 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.042906 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.197386 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwdhf\" (UniqueName: \"kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf\") pod \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") "
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.197590 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume\") pod \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") "
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.197634 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume\") pod \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") "
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.199215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume" (OuterVolumeSpecName: "config-volume") pod "62c3bcc1-5dd6-411d-8030-a152617aa0a3" (UID: "62c3bcc1-5dd6-411d-8030-a152617aa0a3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.205213 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "62c3bcc1-5dd6-411d-8030-a152617aa0a3" (UID: "62c3bcc1-5dd6-411d-8030-a152617aa0a3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.205832 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf" (OuterVolumeSpecName: "kube-api-access-hwdhf") pod "62c3bcc1-5dd6-411d-8030-a152617aa0a3" (UID: "62c3bcc1-5dd6-411d-8030-a152617aa0a3"). InnerVolumeSpecName "kube-api-access-hwdhf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.300485 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwdhf\" (UniqueName: \"kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf\") on node \"crc\" DevicePath \"\""
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.300600 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.300654 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume\") on node \"crc\" DevicePath \"\""
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.691717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" event={"ID":"62c3bcc1-5dd6-411d-8030-a152617aa0a3","Type":"ContainerDied","Data":"204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc"}
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.691829 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc"
Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.691851 4885 util.go:48] "No ready sandbox for pod can be
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:04 crc kubenswrapper[4885]: I0308 21:00:04.132789 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb"] Mar 08 21:00:04 crc kubenswrapper[4885]: I0308 21:00:04.145196 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb"] Mar 08 21:00:04 crc kubenswrapper[4885]: I0308 21:00:04.701959 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" event={"ID":"a0100b61-a97f-40b6-b8fd-91499667f3d9","Type":"ContainerStarted","Data":"c0d8f0f8a4a0c8ee7bf5279891872ae208bc8c52a779f98dff22752b9bff60d5"} Mar 08 21:00:04 crc kubenswrapper[4885]: I0308 21:00:04.726586 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" podStartSLOduration=1.6144750829999999 podStartE2EDuration="4.726559516s" podCreationTimestamp="2026-03-08 21:00:00 +0000 UTC" firstStartedPulling="2026-03-08 21:00:01.091533939 +0000 UTC m=+5302.487587992" lastFinishedPulling="2026-03-08 21:00:04.203618412 +0000 UTC m=+5305.599672425" observedRunningTime="2026-03-08 21:00:04.71881085 +0000 UTC m=+5306.114864903" watchObservedRunningTime="2026-03-08 21:00:04.726559516 +0000 UTC m=+5306.122613579" Mar 08 21:00:05 crc kubenswrapper[4885]: I0308 21:00:05.385586 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d8979b-517e-4b02-8f5a-ead2361596ea" path="/var/lib/kubelet/pods/19d8979b-517e-4b02-8f5a-ead2361596ea/volumes" Mar 08 21:00:05 crc kubenswrapper[4885]: I0308 21:00:05.713298 4885 generic.go:334] "Generic (PLEG): container finished" podID="a0100b61-a97f-40b6-b8fd-91499667f3d9" containerID="c0d8f0f8a4a0c8ee7bf5279891872ae208bc8c52a779f98dff22752b9bff60d5" exitCode=0 Mar 08 21:00:05 crc 
kubenswrapper[4885]: I0308 21:00:05.713352 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" event={"ID":"a0100b61-a97f-40b6-b8fd-91499667f3d9","Type":"ContainerDied","Data":"c0d8f0f8a4a0c8ee7bf5279891872ae208bc8c52a779f98dff22752b9bff60d5"} Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.109743 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.269457 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxjc7\" (UniqueName: \"kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7\") pod \"a0100b61-a97f-40b6-b8fd-91499667f3d9\" (UID: \"a0100b61-a97f-40b6-b8fd-91499667f3d9\") " Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.276030 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7" (OuterVolumeSpecName: "kube-api-access-wxjc7") pod "a0100b61-a97f-40b6-b8fd-91499667f3d9" (UID: "a0100b61-a97f-40b6-b8fd-91499667f3d9"). InnerVolumeSpecName "kube-api-access-wxjc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.371872 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxjc7\" (UniqueName: \"kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7\") on node \"crc\" DevicePath \"\"" Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.738575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" event={"ID":"a0100b61-a97f-40b6-b8fd-91499667f3d9","Type":"ContainerDied","Data":"299d8e0735bd0570edd7a482b1c465695e0b3e1b085d547f6d5575811d194fae"} Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.739082 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="299d8e0735bd0570edd7a482b1c465695e0b3e1b085d547f6d5575811d194fae" Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.738713 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.806443 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550054-wsncs"] Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.815490 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550054-wsncs"] Mar 08 21:00:09 crc kubenswrapper[4885]: I0308 21:00:09.386121 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" path="/var/lib/kubelet/pods/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b/volumes" Mar 08 21:00:20 crc kubenswrapper[4885]: I0308 21:00:20.382157 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-868b8c986d-gxm79" Mar 08 21:00:21 crc kubenswrapper[4885]: I0308 21:00:21.822630 4885 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" podUID="d5136d34-82a8-47c5-9d7d-09e0206587e8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:00:21 crc kubenswrapper[4885]: I0308 21:00:21.860587 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-hq28v" podUID="dca42faa-df32-44b5-99e8-109120aa36a1" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.151123 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.151868 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c3bcc1-5dd6-411d-8030-a152617aa0a3" containerName="collect-profiles" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.151917 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c3bcc1-5dd6-411d-8030-a152617aa0a3" containerName="collect-profiles" Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.151994 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0100b61-a97f-40b6-b8fd-91499667f3d9" containerName="oc" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.152012 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0100b61-a97f-40b6-b8fd-91499667f3d9" containerName="oc" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.152443 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0100b61-a97f-40b6-b8fd-91499667f3d9" containerName="oc" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.152496 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c3bcc1-5dd6-411d-8030-a152617aa0a3" containerName="collect-profiles" Mar 08 21:00:22 crc 
kubenswrapper[4885]: I0308 21:00:22.153718 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.156730 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.158017 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-q5rnq" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.164347 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.169214 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.179886 4885 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-config\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.185369 4885 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"243135a1-f055-4b63-b640-6f751ce8bd08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T21:00:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T21:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T21:00:22Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T21:00:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:2b0a455c844ca790160c48aed1aaf8bc69ceb4b9ed4a4fa1717114e6e2e2fda9\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxzhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T21:00:22Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.186493 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.187039 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cxzhn openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[kube-api-access-cxzhn openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="243135a1-f055-4b63-b640-6f751ce8bd08" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.199774 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.219497 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.220725 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.230050 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="243135a1-f055-4b63-b640-6f751ce8bd08" podUID="f7e4501e-3805-4590-b759-f520d3f98787" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.232939 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.247076 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.247134 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxzhn\" (UniqueName: \"kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.247208 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc 
kubenswrapper[4885]: I0308 21:00:22.349096 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.349174 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.349203 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.349228 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2tgf\" (UniqueName: \"kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.349249 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxzhn\" (UniqueName: \"kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.349301 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.350041 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.352361 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cxzhn for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (243135a1-f055-4b63-b640-6f751ce8bd08) does not match the UID in record. The object might have been deleted and then recreated Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.352418 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn podName:243135a1-f055-4b63-b640-6f751ce8bd08 nodeName:}" failed. No retries permitted until 2026-03-08 21:00:22.852402677 +0000 UTC m=+5324.248456700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cxzhn" (UniqueName: "kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn") pod "openstackclient" (UID: "243135a1-f055-4b63-b640-6f751ce8bd08") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (243135a1-f055-4b63-b640-6f751ce8bd08) does not match the UID in record. 
The object might have been deleted and then recreated Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.359911 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.450484 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.450569 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2tgf\" (UniqueName: \"kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.450817 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.452011 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.456341 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.479259 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2tgf\" (UniqueName: \"kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.585611 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.863488 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.890072 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxzhn\" (UniqueName: \"kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.891491 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cxzhn for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (243135a1-f055-4b63-b640-6f751ce8bd08) does not match the UID in record. 
The object might have been deleted and then recreated Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.891568 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn podName:243135a1-f055-4b63-b640-6f751ce8bd08 nodeName:}" failed. No retries permitted until 2026-03-08 21:00:23.891547574 +0000 UTC m=+5325.287601617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cxzhn" (UniqueName: "kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn") pod "openstackclient" (UID: "243135a1-f055-4b63-b640-6f751ce8bd08") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (243135a1-f055-4b63-b640-6f751ce8bd08) does not match the UID in record. The object might have been deleted and then recreated Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.900814 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.901402 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f7e4501e-3805-4590-b759-f520d3f98787","Type":"ContainerStarted","Data":"5df35decd7c07684767af516967f3d25c57bef185a8f0f1d37e89e339fed67be"} Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.906524 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="243135a1-f055-4b63-b640-6f751ce8bd08" podUID="f7e4501e-3805-4590-b759-f520d3f98787" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.909873 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.912082 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="243135a1-f055-4b63-b640-6f751ce8bd08" podUID="f7e4501e-3805-4590-b759-f520d3f98787" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.991578 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config\") pod \"243135a1-f055-4b63-b640-6f751ce8bd08\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.991681 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret\") pod \"243135a1-f055-4b63-b640-6f751ce8bd08\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.992093 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxzhn\" (UniqueName: \"kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn\") on node \"crc\" DevicePath \"\"" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.992257 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "243135a1-f055-4b63-b640-6f751ce8bd08" (UID: "243135a1-f055-4b63-b640-6f751ce8bd08"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.996047 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "243135a1-f055-4b63-b640-6f751ce8bd08" (UID: "243135a1-f055-4b63-b640-6f751ce8bd08"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.093783 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.093811 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.219704 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.383511 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="243135a1-f055-4b63-b640-6f751ce8bd08" path="/var/lib/kubelet/pods/243135a1-f055-4b63-b640-6f751ce8bd08/volumes" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.912666 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.912682 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f7e4501e-3805-4590-b759-f520d3f98787","Type":"ContainerStarted","Data":"974e17a17c8c2918732ff271aeb4290a267934c8e410394a48e09833b501694e"} Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.916512 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="243135a1-f055-4b63-b640-6f751ce8bd08" podUID="f7e4501e-3805-4590-b759-f520d3f98787" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.940045 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="243135a1-f055-4b63-b640-6f751ce8bd08" podUID="f7e4501e-3805-4590-b759-f520d3f98787" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.948899 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.948878967 podStartE2EDuration="1.948878967s" podCreationTimestamp="2026-03-08 21:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:00:23.935442468 +0000 UTC m=+5325.331496581" watchObservedRunningTime="2026-03-08 21:00:23.948878967 +0000 UTC m=+5325.344933000" Mar 08 21:00:32 crc kubenswrapper[4885]: I0308 21:00:32.818234 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:00:32 crc kubenswrapper[4885]: I0308 21:00:32.818961 4885 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.151831 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29550061-rh8tm"] Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.153562 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.164538 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550061-rh8tm"] Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.224764 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsdz2\" (UniqueName: \"kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.224885 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.224958 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " 
pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.225163 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.326769 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsdz2\" (UniqueName: \"kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.326817 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.326853 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.326914 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " 
pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.332441 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.335663 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.347393 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.354563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsdz2\" (UniqueName: \"kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.507465 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:01 crc kubenswrapper[4885]: I0308 21:01:01.008112 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550061-rh8tm"] Mar 08 21:01:01 crc kubenswrapper[4885]: I0308 21:01:01.289129 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550061-rh8tm" event={"ID":"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5","Type":"ContainerStarted","Data":"a564a36dc2b66e16038c0066ec717a9b0d3f0037f989ced3d1b5cec263d91d8d"} Mar 08 21:01:01 crc kubenswrapper[4885]: I0308 21:01:01.289209 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550061-rh8tm" event={"ID":"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5","Type":"ContainerStarted","Data":"97c35cb9aa0b7a6e145d576e3de6363972b8381ff34cfbe770dd0514bd03ac8f"} Mar 08 21:01:01 crc kubenswrapper[4885]: I0308 21:01:01.322269 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29550061-rh8tm" podStartSLOduration=1.322242705 podStartE2EDuration="1.322242705s" podCreationTimestamp="2026-03-08 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:01:01.308650182 +0000 UTC m=+5362.704704245" watchObservedRunningTime="2026-03-08 21:01:01.322242705 +0000 UTC m=+5362.718296768" Mar 08 21:01:02 crc kubenswrapper[4885]: I0308 21:01:02.817873 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:01:02 crc kubenswrapper[4885]: I0308 21:01:02.818229 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:01:02 crc kubenswrapper[4885]: I0308 21:01:02.818277 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:01:02 crc kubenswrapper[4885]: I0308 21:01:02.818885 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:01:02 crc kubenswrapper[4885]: I0308 21:01:02.818992 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" gracePeriod=600 Mar 08 21:01:02 crc kubenswrapper[4885]: E0308 21:01:02.960237 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:01:02 crc kubenswrapper[4885]: E0308 21:01:02.975445 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e397c25_ae37_4c30_83ce_3bdb83f5b9c5.slice/crio-a564a36dc2b66e16038c0066ec717a9b0d3f0037f989ced3d1b5cec263d91d8d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c5dda3b_3e01_4bb4_af02_b0f4eeadda58.slice/crio-conmon-088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e397c25_ae37_4c30_83ce_3bdb83f5b9c5.slice/crio-conmon-a564a36dc2b66e16038c0066ec717a9b0d3f0037f989ced3d1b5cec263d91d8d.scope\": RecentStats: unable to find data in memory cache]" Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.317102 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" exitCode=0 Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.317168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"} Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.317642 4885 scope.go:117] "RemoveContainer" containerID="b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038" Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.318535 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:01:03 crc kubenswrapper[4885]: E0308 21:01:03.318984 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.321174 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550061-rh8tm" event={"ID":"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5","Type":"ContainerDied","Data":"a564a36dc2b66e16038c0066ec717a9b0d3f0037f989ced3d1b5cec263d91d8d"} Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.321217 4885 generic.go:334] "Generic (PLEG): container finished" podID="7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" containerID="a564a36dc2b66e16038c0066ec717a9b0d3f0037f989ced3d1b5cec263d91d8d" exitCode=0 Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.585306 4885 scope.go:117] "RemoveContainer" containerID="26e42e6d76089e28dc85056b673d6bddbefe770761a003ab85ecc351d49b7771" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.630335 4885 scope.go:117] "RemoveContainer" containerID="9b23f86db419001dec3042d5f280866857f260d2b86edbe13a17fd8cd9ba2fd4" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.723375 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.803754 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle\") pod \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.803829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys\") pod \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.803892 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsdz2\" (UniqueName: \"kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2\") pod \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.803944 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data\") pod \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.809837 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2" (OuterVolumeSpecName: "kube-api-access-xsdz2") pod "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" (UID: "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5"). InnerVolumeSpecName "kube-api-access-xsdz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.823100 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" (UID: "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.854492 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" (UID: "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.866516 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data" (OuterVolumeSpecName: "config-data") pod "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" (UID: "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.907047 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.907099 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.907115 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.907127 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsdz2\" (UniqueName: \"kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2\") on node \"crc\" DevicePath \"\"" Mar 08 21:01:05 crc kubenswrapper[4885]: I0308 21:01:05.348139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550061-rh8tm" event={"ID":"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5","Type":"ContainerDied","Data":"97c35cb9aa0b7a6e145d576e3de6363972b8381ff34cfbe770dd0514bd03ac8f"} Mar 08 21:01:05 crc kubenswrapper[4885]: I0308 21:01:05.348227 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:05 crc kubenswrapper[4885]: I0308 21:01:05.348639 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c35cb9aa0b7a6e145d576e3de6363972b8381ff34cfbe770dd0514bd03ac8f" Mar 08 21:01:14 crc kubenswrapper[4885]: I0308 21:01:14.368740 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:01:14 crc kubenswrapper[4885]: E0308 21:01:14.369830 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:01:26 crc kubenswrapper[4885]: I0308 21:01:26.369168 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:01:26 crc kubenswrapper[4885]: E0308 21:01:26.369951 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:01:31 crc kubenswrapper[4885]: I0308 21:01:31.096596 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-k2nvt"] Mar 08 21:01:31 crc kubenswrapper[4885]: I0308 21:01:31.106145 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-k2nvt"] Mar 08 21:01:31 crc 
kubenswrapper[4885]: I0308 21:01:31.408913 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd375ae-21ae-4fb9-87dc-f6a1a205736f" path="/var/lib/kubelet/pods/1dd375ae-21ae-4fb9-87dc-f6a1a205736f/volumes" Mar 08 21:01:40 crc kubenswrapper[4885]: I0308 21:01:40.382035 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:01:40 crc kubenswrapper[4885]: E0308 21:01:40.383728 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:01:52 crc kubenswrapper[4885]: I0308 21:01:52.369479 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:01:52 crc kubenswrapper[4885]: E0308 21:01:52.370436 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.150062 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550062-kdw5m"] Mar 08 21:02:00 crc kubenswrapper[4885]: E0308 21:02:00.151544 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" containerName="keystone-cron" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 
21:02:00.151579 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" containerName="keystone-cron" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.151950 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" containerName="keystone-cron" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.152768 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.162145 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.162231 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.162454 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550062-kdw5m"] Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.162569 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.280375 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzlth\" (UniqueName: \"kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth\") pod \"auto-csr-approver-29550062-kdw5m\" (UID: \"86555f65-5ef4-4c45-9ac3-9b561d985b57\") " pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.382272 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzlth\" (UniqueName: \"kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth\") pod \"auto-csr-approver-29550062-kdw5m\" (UID: 
\"86555f65-5ef4-4c45-9ac3-9b561d985b57\") " pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.402155 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzlth\" (UniqueName: \"kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth\") pod \"auto-csr-approver-29550062-kdw5m\" (UID: \"86555f65-5ef4-4c45-9ac3-9b561d985b57\") " pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.479792 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.968960 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550062-kdw5m"] Mar 08 21:02:01 crc kubenswrapper[4885]: I0308 21:02:01.897426 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" event={"ID":"86555f65-5ef4-4c45-9ac3-9b561d985b57","Type":"ContainerStarted","Data":"93ccdabcc664dc32500aa049baa7622b407db997e21445ce32227e0dad61f2b7"} Mar 08 21:02:02 crc kubenswrapper[4885]: I0308 21:02:02.922638 4885 generic.go:334] "Generic (PLEG): container finished" podID="86555f65-5ef4-4c45-9ac3-9b561d985b57" containerID="f10312aca3cd4fa64b1d669949edd0e9d6f21408d583e376501255867513b217" exitCode=0 Mar 08 21:02:02 crc kubenswrapper[4885]: I0308 21:02:02.922849 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" event={"ID":"86555f65-5ef4-4c45-9ac3-9b561d985b57","Type":"ContainerDied","Data":"f10312aca3cd4fa64b1d669949edd0e9d6f21408d583e376501255867513b217"} Mar 08 21:02:03 crc kubenswrapper[4885]: I0308 21:02:03.368423 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:02:03 crc kubenswrapper[4885]: 
E0308 21:02:03.368868 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.425344 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.566613 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzlth\" (UniqueName: \"kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth\") pod \"86555f65-5ef4-4c45-9ac3-9b561d985b57\" (UID: \"86555f65-5ef4-4c45-9ac3-9b561d985b57\") " Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.571322 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth" (OuterVolumeSpecName: "kube-api-access-tzlth") pod "86555f65-5ef4-4c45-9ac3-9b561d985b57" (UID: "86555f65-5ef4-4c45-9ac3-9b561d985b57"). InnerVolumeSpecName "kube-api-access-tzlth". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.669415 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzlth\" (UniqueName: \"kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.831759 4885 scope.go:117] "RemoveContainer" containerID="d3ae625b11e0cf7052090345483fbacc9c9a2ab6adc1e7e832c166efaabc3867" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.952128 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.952263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" event={"ID":"86555f65-5ef4-4c45-9ac3-9b561d985b57","Type":"ContainerDied","Data":"93ccdabcc664dc32500aa049baa7622b407db997e21445ce32227e0dad61f2b7"} Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.952339 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93ccdabcc664dc32500aa049baa7622b407db997e21445ce32227e0dad61f2b7" Mar 08 21:02:05 crc kubenswrapper[4885]: I0308 21:02:05.512680 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550056-kpvwx"] Mar 08 21:02:05 crc kubenswrapper[4885]: I0308 21:02:05.518721 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550056-kpvwx"] Mar 08 21:02:07 crc kubenswrapper[4885]: I0308 21:02:07.386471 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" path="/var/lib/kubelet/pods/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2/volumes" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.626644 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-2750-account-create-update-m8vl9"] Mar 08 21:02:08 crc kubenswrapper[4885]: E0308 21:02:08.627413 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86555f65-5ef4-4c45-9ac3-9b561d985b57" containerName="oc" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.627428 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86555f65-5ef4-4c45-9ac3-9b561d985b57" containerName="oc" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.627635 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86555f65-5ef4-4c45-9ac3-9b561d985b57" containerName="oc" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.628338 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.633296 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.674357 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.674429 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt9gr\" (UniqueName: \"kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.674562 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-db-create-bd9sr"] Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.675503 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.697818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2750-account-create-update-m8vl9"] Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.727162 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bd9sr"] Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.775872 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.775962 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt9gr\" (UniqueName: \"kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.776025 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9zq7\" (UniqueName: \"kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.776046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.776711 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.807268 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt9gr\" (UniqueName: \"kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.877542 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9zq7\" (UniqueName: \"kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.877589 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.878244 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.898191 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9zq7\" (UniqueName: \"kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.027983 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.054703 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.490216 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2750-account-create-update-m8vl9"] Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.553870 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bd9sr"] Mar 08 21:02:09 crc kubenswrapper[4885]: W0308 21:02:09.561091 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf6b9c40_d823_4cb8_aadd_4f2aee7bd899.slice/crio-f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989 WatchSource:0}: Error finding container f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989: Status 404 returned error can't find the container with id f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989 Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.997829 4885 generic.go:334] "Generic (PLEG): container 
finished" podID="379d344d-9828-4fae-a4f4-5712113f506d" containerID="005e15265fa043b9b659e044fe35d74117bca8b49d4e6e5ad4ce0be3aeda6fee" exitCode=0 Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.998008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2750-account-create-update-m8vl9" event={"ID":"379d344d-9828-4fae-a4f4-5712113f506d","Type":"ContainerDied","Data":"005e15265fa043b9b659e044fe35d74117bca8b49d4e6e5ad4ce0be3aeda6fee"} Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.998209 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2750-account-create-update-m8vl9" event={"ID":"379d344d-9828-4fae-a4f4-5712113f506d","Type":"ContainerStarted","Data":"a5a2fb2544bb95844b76ec2e721c2ed06a33669f04a2220663ef3871f94f70ed"} Mar 08 21:02:10 crc kubenswrapper[4885]: I0308 21:02:10.001427 4885 generic.go:334] "Generic (PLEG): container finished" podID="df6b9c40-d823-4cb8-aadd-4f2aee7bd899" containerID="5238776febd95a86282109260190e0f71b38f87e63e8af8a383a946420238586" exitCode=0 Mar 08 21:02:10 crc kubenswrapper[4885]: I0308 21:02:10.001478 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bd9sr" event={"ID":"df6b9c40-d823-4cb8-aadd-4f2aee7bd899","Type":"ContainerDied","Data":"5238776febd95a86282109260190e0f71b38f87e63e8af8a383a946420238586"} Mar 08 21:02:10 crc kubenswrapper[4885]: I0308 21:02:10.001515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bd9sr" event={"ID":"df6b9c40-d823-4cb8-aadd-4f2aee7bd899","Type":"ContainerStarted","Data":"f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989"} Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.467028 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.474014 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.643027 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9zq7\" (UniqueName: \"kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7\") pod \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.643108 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts\") pod \"379d344d-9828-4fae-a4f4-5712113f506d\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.643159 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt9gr\" (UniqueName: \"kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr\") pod \"379d344d-9828-4fae-a4f4-5712113f506d\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.643189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts\") pod \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.644228 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df6b9c40-d823-4cb8-aadd-4f2aee7bd899" (UID: "df6b9c40-d823-4cb8-aadd-4f2aee7bd899"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.644378 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "379d344d-9828-4fae-a4f4-5712113f506d" (UID: "379d344d-9828-4fae-a4f4-5712113f506d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.653168 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7" (OuterVolumeSpecName: "kube-api-access-g9zq7") pod "df6b9c40-d823-4cb8-aadd-4f2aee7bd899" (UID: "df6b9c40-d823-4cb8-aadd-4f2aee7bd899"). InnerVolumeSpecName "kube-api-access-g9zq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.656330 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr" (OuterVolumeSpecName: "kube-api-access-kt9gr") pod "379d344d-9828-4fae-a4f4-5712113f506d" (UID: "379d344d-9828-4fae-a4f4-5712113f506d"). InnerVolumeSpecName "kube-api-access-kt9gr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.745397 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt9gr\" (UniqueName: \"kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.745460 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.745476 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9zq7\" (UniqueName: \"kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.745491 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.020462 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2750-account-create-update-m8vl9" event={"ID":"379d344d-9828-4fae-a4f4-5712113f506d","Type":"ContainerDied","Data":"a5a2fb2544bb95844b76ec2e721c2ed06a33669f04a2220663ef3871f94f70ed"} Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.020529 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5a2fb2544bb95844b76ec2e721c2ed06a33669f04a2220663ef3871f94f70ed" Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.020654 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.023797 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bd9sr" event={"ID":"df6b9c40-d823-4cb8-aadd-4f2aee7bd899","Type":"ContainerDied","Data":"f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989"} Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.024030 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989" Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.023892 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.993346 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-lkccb"] Mar 08 21:02:13 crc kubenswrapper[4885]: E0308 21:02:13.993972 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6b9c40-d823-4cb8-aadd-4f2aee7bd899" containerName="mariadb-database-create" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.993987 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6b9c40-d823-4cb8-aadd-4f2aee7bd899" containerName="mariadb-database-create" Mar 08 21:02:13 crc kubenswrapper[4885]: E0308 21:02:13.994026 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379d344d-9828-4fae-a4f4-5712113f506d" containerName="mariadb-account-create-update" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.994034 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="379d344d-9828-4fae-a4f4-5712113f506d" containerName="mariadb-account-create-update" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.994214 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6b9c40-d823-4cb8-aadd-4f2aee7bd899" containerName="mariadb-database-create" 
Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.994238 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="379d344d-9828-4fae-a4f4-5712113f506d" containerName="mariadb-account-create-update" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.994836 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.997586 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.999537 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bnc8w" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.021083 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lkccb"] Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.192718 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.192785 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.192853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzn5\" (UniqueName: \"kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5\") pod 
\"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.295103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzn5\" (UniqueName: \"kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.295275 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.295353 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.300901 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.301411 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " 
pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.315340 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzn5\" (UniqueName: \"kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.360872 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.807337 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lkccb"] Mar 08 21:02:15 crc kubenswrapper[4885]: I0308 21:02:15.079781 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lkccb" event={"ID":"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46","Type":"ContainerStarted","Data":"0976abd329fd9ab97b11b664eb364407380933db3aade963696b316b7306d1fa"} Mar 08 21:02:15 crc kubenswrapper[4885]: I0308 21:02:15.079852 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lkccb" event={"ID":"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46","Type":"ContainerStarted","Data":"c6a06d00faa7055232099d503297429068f1cc547212244ab7619732dc98b59d"} Mar 08 21:02:15 crc kubenswrapper[4885]: I0308 21:02:15.103256 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-lkccb" podStartSLOduration=2.103235845 podStartE2EDuration="2.103235845s" podCreationTimestamp="2026-03-08 21:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:15.098095267 +0000 UTC m=+5436.494149320" watchObservedRunningTime="2026-03-08 21:02:15.103235845 +0000 UTC m=+5436.499289868" Mar 08 21:02:15 crc kubenswrapper[4885]: I0308 
21:02:15.369187 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:02:15 crc kubenswrapper[4885]: E0308 21:02:15.370693 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:02:17 crc kubenswrapper[4885]: I0308 21:02:17.105768 4885 generic.go:334] "Generic (PLEG): container finished" podID="ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" containerID="0976abd329fd9ab97b11b664eb364407380933db3aade963696b316b7306d1fa" exitCode=0 Mar 08 21:02:17 crc kubenswrapper[4885]: I0308 21:02:17.105981 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lkccb" event={"ID":"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46","Type":"ContainerDied","Data":"0976abd329fd9ab97b11b664eb364407380933db3aade963696b316b7306d1fa"} Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.545695 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.697671 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mzn5\" (UniqueName: \"kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5\") pod \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.698122 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle\") pod \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.698200 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data\") pod \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.705113 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5" (OuterVolumeSpecName: "kube-api-access-6mzn5") pod "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" (UID: "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46"). InnerVolumeSpecName "kube-api-access-6mzn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.705863 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" (UID: "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.747120 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" (UID: "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.800383 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mzn5\" (UniqueName: \"kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.800433 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.800505 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.133497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lkccb" event={"ID":"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46","Type":"ContainerDied","Data":"c6a06d00faa7055232099d503297429068f1cc547212244ab7619732dc98b59d"} Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.133566 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6a06d00faa7055232099d503297429068f1cc547212244ab7619732dc98b59d" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.133613 4885 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.448993 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-755df7d9d5-kl4vq"] Mar 08 21:02:19 crc kubenswrapper[4885]: E0308 21:02:19.449435 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" containerName="barbican-db-sync" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.449456 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" containerName="barbican-db-sync" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.449664 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" containerName="barbican-db-sync" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.450752 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.452595 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bnc8w" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.455031 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.462537 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-c754cdbdb-h7rpz"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.465743 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.471216 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.476156 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.479049 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-755df7d9d5-kl4vq"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.507239 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c754cdbdb-h7rpz"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.550341 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.551505 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.578226 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.621782 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8bdb095-595a-458e-870f-41fea2999d18-logs\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.621833 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.621889 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-combined-ca-bundle\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.621950 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data-custom\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.621977 
4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zqjc\" (UniqueName: \"kubernetes.io/projected/a8bdb095-595a-458e-870f-41fea2999d18-kube-api-access-5zqjc\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.622000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-combined-ca-bundle\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.622019 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc4cm\" (UniqueName: \"kubernetes.io/projected/b7b24c26-4c9a-4442-a124-a66987404ec8-kube-api-access-wc4cm\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.622046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data-custom\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.622071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: 
\"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.622097 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7b24c26-4c9a-4442-a124-a66987404ec8-logs\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.643614 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8469b78fd4-9xh8z"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.648819 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.651283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.659357 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8469b78fd4-9xh8z"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.723827 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-combined-ca-bundle\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724240 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" 
Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724273 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724307 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data-custom\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724336 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zqjc\" (UniqueName: \"kubernetes.io/projected/a8bdb095-595a-458e-870f-41fea2999d18-kube-api-access-5zqjc\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724357 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724375 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-combined-ca-bundle\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: 
\"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724396 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc4cm\" (UniqueName: \"kubernetes.io/projected/b7b24c26-4c9a-4442-a124-a66987404ec8-kube-api-access-wc4cm\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724426 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data-custom\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724476 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724506 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7b24c26-4c9a-4442-a124-a66987404ec8-logs\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724545 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: 
\"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724569 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8bdb095-595a-458e-870f-41fea2999d18-logs\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724620 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.725745 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8bdb095-595a-458e-870f-41fea2999d18-logs\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.725832 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7b24c26-4c9a-4442-a124-a66987404ec8-logs\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " 
pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.730694 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data-custom\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.734917 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data-custom\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.740696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-combined-ca-bundle\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.741363 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-combined-ca-bundle\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.742971 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " 
pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.743153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.743642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc4cm\" (UniqueName: \"kubernetes.io/projected/b7b24c26-4c9a-4442-a124-a66987404ec8-kube-api-access-wc4cm\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.747568 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zqjc\" (UniqueName: \"kubernetes.io/projected/a8bdb095-595a-458e-870f-41fea2999d18-kube-api-access-5zqjc\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.777195 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.801565 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827044 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59vk9\" (UniqueName: \"kubernetes.io/projected/ebfd95bc-213c-417c-8dd5-b66637bd98e9-kube-api-access-59vk9\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827129 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebfd95bc-213c-417c-8dd5-b66637bd98e9-logs\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827191 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827330 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data-custom\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: 
\"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827370 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827454 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827503 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827574 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-combined-ca-bundle\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: 
\"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.830098 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.830268 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.836030 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.836346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.845573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc 
kubenswrapper[4885]: I0308 21:02:19.874008 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.928652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59vk9\" (UniqueName: \"kubernetes.io/projected/ebfd95bc-213c-417c-8dd5-b66637bd98e9-kube-api-access-59vk9\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.929100 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebfd95bc-213c-417c-8dd5-b66637bd98e9-logs\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.929172 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data-custom\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.929202 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.929241 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-combined-ca-bundle\") pod 
\"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.930696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebfd95bc-213c-417c-8dd5-b66637bd98e9-logs\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.937011 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-combined-ca-bundle\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.938016 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.941681 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data-custom\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.957467 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59vk9\" (UniqueName: \"kubernetes.io/projected/ebfd95bc-213c-417c-8dd5-b66637bd98e9-kube-api-access-59vk9\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " 
pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.978502 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:20 crc kubenswrapper[4885]: I0308 21:02:20.289321 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-755df7d9d5-kl4vq"] Mar 08 21:02:20 crc kubenswrapper[4885]: I0308 21:02:20.302516 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c754cdbdb-h7rpz"] Mar 08 21:02:20 crc kubenswrapper[4885]: W0308 21:02:20.303669 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebfd95bc_213c_417c_8dd5_b66637bd98e9.slice/crio-47d0123acd29cc322bfd28e25306def81e683b35e8949a484cfb307c0c385e88 WatchSource:0}: Error finding container 47d0123acd29cc322bfd28e25306def81e683b35e8949a484cfb307c0c385e88: Status 404 returned error can't find the container with id 47d0123acd29cc322bfd28e25306def81e683b35e8949a484cfb307c0c385e88 Mar 08 21:02:20 crc kubenswrapper[4885]: I0308 21:02:20.311357 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8469b78fd4-9xh8z"] Mar 08 21:02:20 crc kubenswrapper[4885]: W0308 21:02:20.435210 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b9b5286_1ced_4445_964d_2ec8fc6a17a4.slice/crio-926b79f9df7ed6725ef54961ffc104745dc49799aef4900236f92faa1797b387 WatchSource:0}: Error finding container 926b79f9df7ed6725ef54961ffc104745dc49799aef4900236f92faa1797b387: Status 404 returned error can't find the container with id 926b79f9df7ed6725ef54961ffc104745dc49799aef4900236f92faa1797b387 Mar 08 21:02:20 crc kubenswrapper[4885]: I0308 21:02:20.442423 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"] Mar 08 
21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.627328 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8469b78fd4-9xh8z" event={"ID":"ebfd95bc-213c-417c-8dd5-b66637bd98e9","Type":"ContainerStarted","Data":"4f39bd047d3fe77ffdac047ba95989e94eead27192344f44db80bb4a53e9d268"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.627851 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8469b78fd4-9xh8z" event={"ID":"ebfd95bc-213c-417c-8dd5-b66637bd98e9","Type":"ContainerStarted","Data":"72c7e26c6898ff8d593f798cd90453e0d384a2a77d538a79ee7908646c927680"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.627862 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8469b78fd4-9xh8z" event={"ID":"ebfd95bc-213c-417c-8dd5-b66637bd98e9","Type":"ContainerStarted","Data":"47d0123acd29cc322bfd28e25306def81e683b35e8949a484cfb307c0c385e88"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.629027 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.629051 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.650631 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" event={"ID":"a8bdb095-595a-458e-870f-41fea2999d18","Type":"ContainerStarted","Data":"2ad3124b0caf02798cc751c8df70dda6dec35a8c8734b02d64f94638250471d6"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.650689 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" event={"ID":"a8bdb095-595a-458e-870f-41fea2999d18","Type":"ContainerStarted","Data":"7a62561ad7c35cf5a96cff3d24814864ea4a271a0749070d1dd9157fcb795d18"} Mar 08 21:02:21 crc kubenswrapper[4885]: 
I0308 21:02:21.650703 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" event={"ID":"a8bdb095-595a-458e-870f-41fea2999d18","Type":"ContainerStarted","Data":"52e0a15f8359f92d82098188b2c46cadd052b25468209f9fb6556966e8f26965"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.671898 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-755df7d9d5-kl4vq" event={"ID":"b7b24c26-4c9a-4442-a124-a66987404ec8","Type":"ContainerStarted","Data":"47f2a26668a9b3bc5bfd6a84075276d605a5652cb0215d6064a55085e547dbdf"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.672035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-755df7d9d5-kl4vq" event={"ID":"b7b24c26-4c9a-4442-a124-a66987404ec8","Type":"ContainerStarted","Data":"0b350cd53edac40273754109de42b3373bc1cf10069ee5e6f8919617e766522b"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.672053 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-755df7d9d5-kl4vq" event={"ID":"b7b24c26-4c9a-4442-a124-a66987404ec8","Type":"ContainerStarted","Data":"a0f3d48b601ce1645c913dfe43e92db55ad2b11eed1e29579690130f9b487a52"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.675528 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8469b78fd4-9xh8z" podStartSLOduration=2.6754955049999998 podStartE2EDuration="2.675495505s" podCreationTimestamp="2026-03-08 21:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:21.663495664 +0000 UTC m=+5443.059549727" watchObservedRunningTime="2026-03-08 21:02:21.675495505 +0000 UTC m=+5443.071549558" Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.681396 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" 
event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerStarted","Data":"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.681439 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerStarted","Data":"926b79f9df7ed6725ef54961ffc104745dc49799aef4900236f92faa1797b387"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.713773 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" podStartSLOduration=2.713743054 podStartE2EDuration="2.713743054s" podCreationTimestamp="2026-03-08 21:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:21.704429315 +0000 UTC m=+5443.100483348" watchObservedRunningTime="2026-03-08 21:02:21.713743054 +0000 UTC m=+5443.109797087" Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.736837 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-755df7d9d5-kl4vq" podStartSLOduration=2.73681872 podStartE2EDuration="2.73681872s" podCreationTimestamp="2026-03-08 21:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:21.734696223 +0000 UTC m=+5443.130750266" watchObservedRunningTime="2026-03-08 21:02:21.73681872 +0000 UTC m=+5443.132872743" Mar 08 21:02:22 crc kubenswrapper[4885]: I0308 21:02:22.692386 4885 generic.go:334] "Generic (PLEG): container finished" podID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerID="f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729" exitCode=0 Mar 08 21:02:22 crc kubenswrapper[4885]: I0308 21:02:22.694578 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerDied","Data":"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729"} Mar 08 21:02:22 crc kubenswrapper[4885]: I0308 21:02:22.695046 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerStarted","Data":"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a"} Mar 08 21:02:22 crc kubenswrapper[4885]: I0308 21:02:22.695770 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:22 crc kubenswrapper[4885]: I0308 21:02:22.719654 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" podStartSLOduration=3.719635277 podStartE2EDuration="3.719635277s" podCreationTimestamp="2026-03-08 21:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:22.718157357 +0000 UTC m=+5444.114211390" watchObservedRunningTime="2026-03-08 21:02:22.719635277 +0000 UTC m=+5444.115689320" Mar 08 21:02:29 crc kubenswrapper[4885]: I0308 21:02:29.876417 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:29 crc kubenswrapper[4885]: I0308 21:02:29.992165 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"] Mar 08 21:02:29 crc kubenswrapper[4885]: I0308 21:02:29.992465 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="dnsmasq-dns" containerID="cri-o://c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0" gracePeriod=10 Mar 08 21:02:30 crc kubenswrapper[4885]: 
I0308 21:02:30.368027 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:02:30 crc kubenswrapper[4885]: E0308 21:02:30.368594 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.420719 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.560264 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config\") pod \"c4efef13-d123-4300-b581-2a9a52de6d1b\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.560335 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb\") pod \"c4efef13-d123-4300-b581-2a9a52de6d1b\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.560371 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb\") pod \"c4efef13-d123-4300-b581-2a9a52de6d1b\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.560413 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc\") pod \"c4efef13-d123-4300-b581-2a9a52de6d1b\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.560519 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2d7q\" (UniqueName: \"kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q\") pod \"c4efef13-d123-4300-b581-2a9a52de6d1b\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.568290 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q" (OuterVolumeSpecName: "kube-api-access-g2d7q") pod "c4efef13-d123-4300-b581-2a9a52de6d1b" (UID: "c4efef13-d123-4300-b581-2a9a52de6d1b"). InnerVolumeSpecName "kube-api-access-g2d7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.632817 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config" (OuterVolumeSpecName: "config") pod "c4efef13-d123-4300-b581-2a9a52de6d1b" (UID: "c4efef13-d123-4300-b581-2a9a52de6d1b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.633480 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4efef13-d123-4300-b581-2a9a52de6d1b" (UID: "c4efef13-d123-4300-b581-2a9a52de6d1b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.635150 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4efef13-d123-4300-b581-2a9a52de6d1b" (UID: "c4efef13-d123-4300-b581-2a9a52de6d1b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.639385 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4efef13-d123-4300-b581-2a9a52de6d1b" (UID: "c4efef13-d123-4300-b581-2a9a52de6d1b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.662031 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2d7q\" (UniqueName: \"kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.662068 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.662078 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.662087 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.662096 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.770034 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerID="c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0" exitCode=0 Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.770082 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" event={"ID":"c4efef13-d123-4300-b581-2a9a52de6d1b","Type":"ContainerDied","Data":"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0"} Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.770109 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" event={"ID":"c4efef13-d123-4300-b581-2a9a52de6d1b","Type":"ContainerDied","Data":"951c5b88e4e526df079bda44e005142fe20c5e1f10cb2f30511bef46f3a2b1e7"} Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.770129 4885 scope.go:117] "RemoveContainer" containerID="c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.770264 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.810128 4885 scope.go:117] "RemoveContainer" containerID="d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.831829 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"] Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.846008 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"] Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.848883 4885 scope.go:117] "RemoveContainer" containerID="c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0" Mar 08 21:02:30 crc kubenswrapper[4885]: E0308 21:02:30.849473 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0\": container with ID starting with c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0 not found: ID does not exist" containerID="c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.849509 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0"} err="failed to get container status \"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0\": rpc error: code = NotFound desc = could not find container \"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0\": container with ID starting with c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0 not found: ID does not exist" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.849534 4885 scope.go:117] "RemoveContainer" containerID="d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9" Mar 08 
21:02:30 crc kubenswrapper[4885]: E0308 21:02:30.850339 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9\": container with ID starting with d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9 not found: ID does not exist" containerID="d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.850400 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9"} err="failed to get container status \"d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9\": rpc error: code = NotFound desc = could not find container \"d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9\": container with ID starting with d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9 not found: ID does not exist" Mar 08 21:02:31 crc kubenswrapper[4885]: I0308 21:02:31.333697 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:31 crc kubenswrapper[4885]: I0308 21:02:31.342673 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:31 crc kubenswrapper[4885]: I0308 21:02:31.383211 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" path="/var/lib/kubelet/pods/c4efef13-d123-4300-b581-2a9a52de6d1b/volumes" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.368081 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:02:43 crc kubenswrapper[4885]: E0308 21:02:43.370239 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.693537 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6ln6s"] Mar 08 21:02:43 crc kubenswrapper[4885]: E0308 21:02:43.693863 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="init" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.693880 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="init" Mar 08 21:02:43 crc kubenswrapper[4885]: E0308 21:02:43.693903 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="dnsmasq-dns" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.693911 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="dnsmasq-dns" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.694097 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="dnsmasq-dns" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.694595 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.711766 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6ln6s"] Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.736856 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b046-account-create-update-7xtx9"] Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.737852 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.741910 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.760344 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b046-account-create-update-7xtx9"] Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.830889 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts69b\" (UniqueName: \"kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.830982 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.831010 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xfw2z\" (UniqueName: \"kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.831269 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.933090 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.933214 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts69b\" (UniqueName: \"kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.933285 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.933312 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xfw2z\" (UniqueName: \"kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.934007 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.934483 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.964089 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts69b\" (UniqueName: \"kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.971742 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfw2z\" (UniqueName: \"kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.027297 4885 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.060066 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.528986 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6ln6s"] Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.607026 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b046-account-create-update-7xtx9"] Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.915789 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b046-account-create-update-7xtx9" event={"ID":"d6ea4544-00f0-4646-a598-1efa92af4e49","Type":"ContainerStarted","Data":"e6faf1d3d8e8aff220f85b274a14bd4ce7db4420e55b8b8af43285742e4b286e"} Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.915854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b046-account-create-update-7xtx9" event={"ID":"d6ea4544-00f0-4646-a598-1efa92af4e49","Type":"ContainerStarted","Data":"d6e4ebace88ecbfd56312b9d0ebe995dd277ce5ebc8f4ec907b7675479aff650"} Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.918188 4885 generic.go:334] "Generic (PLEG): container finished" podID="0e0914c1-cac8-4c2d-bbe4-615218170f10" containerID="71bb049c2d9773b9c9e48cbd2812e843fb5d6ca86b0d975e407dfe49238257fc" exitCode=0 Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.918265 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6ln6s" event={"ID":"0e0914c1-cac8-4c2d-bbe4-615218170f10","Type":"ContainerDied","Data":"71bb049c2d9773b9c9e48cbd2812e843fb5d6ca86b0d975e407dfe49238257fc"} Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.918312 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6ln6s" 
event={"ID":"0e0914c1-cac8-4c2d-bbe4-615218170f10","Type":"ContainerStarted","Data":"184ba71716d6bf4cb475dd9dfd1e7900c1abcf7ef45ea292a51b5046dcd43f26"} Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.939575 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b046-account-create-update-7xtx9" podStartSLOduration=1.939541539 podStartE2EDuration="1.939541539s" podCreationTimestamp="2026-03-08 21:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:44.929512532 +0000 UTC m=+5466.325566585" watchObservedRunningTime="2026-03-08 21:02:44.939541539 +0000 UTC m=+5466.335595602" Mar 08 21:02:45 crc kubenswrapper[4885]: I0308 21:02:45.932991 4885 generic.go:334] "Generic (PLEG): container finished" podID="d6ea4544-00f0-4646-a598-1efa92af4e49" containerID="e6faf1d3d8e8aff220f85b274a14bd4ce7db4420e55b8b8af43285742e4b286e" exitCode=0 Mar 08 21:02:45 crc kubenswrapper[4885]: I0308 21:02:45.934100 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b046-account-create-update-7xtx9" event={"ID":"d6ea4544-00f0-4646-a598-1efa92af4e49","Type":"ContainerDied","Data":"e6faf1d3d8e8aff220f85b274a14bd4ce7db4420e55b8b8af43285742e4b286e"} Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.312395 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.388031 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts\") pod \"0e0914c1-cac8-4c2d-bbe4-615218170f10\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.388105 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfw2z\" (UniqueName: \"kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z\") pod \"0e0914c1-cac8-4c2d-bbe4-615218170f10\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.389432 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e0914c1-cac8-4c2d-bbe4-615218170f10" (UID: "0e0914c1-cac8-4c2d-bbe4-615218170f10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.394629 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z" (OuterVolumeSpecName: "kube-api-access-xfw2z") pod "0e0914c1-cac8-4c2d-bbe4-615218170f10" (UID: "0e0914c1-cac8-4c2d-bbe4-615218170f10"). InnerVolumeSpecName "kube-api-access-xfw2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.490208 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.490998 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfw2z\" (UniqueName: \"kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.945068 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.945080 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6ln6s" event={"ID":"0e0914c1-cac8-4c2d-bbe4-615218170f10","Type":"ContainerDied","Data":"184ba71716d6bf4cb475dd9dfd1e7900c1abcf7ef45ea292a51b5046dcd43f26"} Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.945205 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="184ba71716d6bf4cb475dd9dfd1e7900c1abcf7ef45ea292a51b5046dcd43f26" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.387640 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.507738 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts\") pod \"d6ea4544-00f0-4646-a598-1efa92af4e49\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.507900 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts69b\" (UniqueName: \"kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b\") pod \"d6ea4544-00f0-4646-a598-1efa92af4e49\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.509876 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6ea4544-00f0-4646-a598-1efa92af4e49" (UID: "d6ea4544-00f0-4646-a598-1efa92af4e49"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.513292 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b" (OuterVolumeSpecName: "kube-api-access-ts69b") pod "d6ea4544-00f0-4646-a598-1efa92af4e49" (UID: "d6ea4544-00f0-4646-a598-1efa92af4e49"). InnerVolumeSpecName "kube-api-access-ts69b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.610665 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.610703 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts69b\" (UniqueName: \"kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.981832 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b046-account-create-update-7xtx9" event={"ID":"d6ea4544-00f0-4646-a598-1efa92af4e49","Type":"ContainerDied","Data":"d6e4ebace88ecbfd56312b9d0ebe995dd277ce5ebc8f4ec907b7675479aff650"} Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.981897 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e4ebace88ecbfd56312b9d0ebe995dd277ce5ebc8f4ec907b7675479aff650" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.982044 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.066944 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-th6xc"] Mar 08 21:02:49 crc kubenswrapper[4885]: E0308 21:02:49.067456 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ea4544-00f0-4646-a598-1efa92af4e49" containerName="mariadb-account-create-update" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.067478 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ea4544-00f0-4646-a598-1efa92af4e49" containerName="mariadb-account-create-update" Mar 08 21:02:49 crc kubenswrapper[4885]: E0308 21:02:49.067500 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0914c1-cac8-4c2d-bbe4-615218170f10" containerName="mariadb-database-create" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.067510 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0914c1-cac8-4c2d-bbe4-615218170f10" containerName="mariadb-database-create" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.067820 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0914c1-cac8-4c2d-bbe4-615218170f10" containerName="mariadb-database-create" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.067865 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ea4544-00f0-4646-a598-1efa92af4e49" containerName="mariadb-account-create-update" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.068676 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.070571 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.071557 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.071880 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-894h5" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.089447 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-th6xc"] Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.142077 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.142141 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.142313 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r925\" (UniqueName: \"kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.244569 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.244660 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r925\" (UniqueName: \"kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.244790 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.250950 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.256925 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.264998 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r925\" (UniqueName: 
\"kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.395958 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.869181 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-th6xc"] Mar 08 21:02:50 crc kubenswrapper[4885]: I0308 21:02:50.007098 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-th6xc" event={"ID":"faeab210-5195-4d9a-a17e-5aed2f14dc68","Type":"ContainerStarted","Data":"105d64f67e44ac823e455dbc7ab3d14e07ef163def457f807561420bfad8e641"} Mar 08 21:02:51 crc kubenswrapper[4885]: I0308 21:02:51.024452 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-th6xc" event={"ID":"faeab210-5195-4d9a-a17e-5aed2f14dc68","Type":"ContainerStarted","Data":"bbb04e31c22cb5e6b28c152a2e21dcf4858fe3b83e03435366a3a4cecdd397ef"} Mar 08 21:02:51 crc kubenswrapper[4885]: I0308 21:02:51.056995 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-th6xc" podStartSLOduration=2.056976089 podStartE2EDuration="2.056976089s" podCreationTimestamp="2026-03-08 21:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:51.05359913 +0000 UTC m=+5472.449653213" watchObservedRunningTime="2026-03-08 21:02:51.056976089 +0000 UTC m=+5472.453030122" Mar 08 21:02:54 crc kubenswrapper[4885]: I0308 21:02:54.058332 4885 generic.go:334] "Generic (PLEG): container finished" podID="faeab210-5195-4d9a-a17e-5aed2f14dc68" containerID="bbb04e31c22cb5e6b28c152a2e21dcf4858fe3b83e03435366a3a4cecdd397ef" exitCode=0 Mar 08 21:02:54 crc 
kubenswrapper[4885]: I0308 21:02:54.058402 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-th6xc" event={"ID":"faeab210-5195-4d9a-a17e-5aed2f14dc68","Type":"ContainerDied","Data":"bbb04e31c22cb5e6b28c152a2e21dcf4858fe3b83e03435366a3a4cecdd397ef"}
Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.511046 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-th6xc"
Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.690640 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle\") pod \"faeab210-5195-4d9a-a17e-5aed2f14dc68\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") "
Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.690849 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r925\" (UniqueName: \"kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925\") pod \"faeab210-5195-4d9a-a17e-5aed2f14dc68\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") "
Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.691023 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config\") pod \"faeab210-5195-4d9a-a17e-5aed2f14dc68\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") "
Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.704272 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925" (OuterVolumeSpecName: "kube-api-access-4r925") pod "faeab210-5195-4d9a-a17e-5aed2f14dc68" (UID: "faeab210-5195-4d9a-a17e-5aed2f14dc68"). InnerVolumeSpecName "kube-api-access-4r925". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.730248 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faeab210-5195-4d9a-a17e-5aed2f14dc68" (UID: "faeab210-5195-4d9a-a17e-5aed2f14dc68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.731064 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config" (OuterVolumeSpecName: "config") pod "faeab210-5195-4d9a-a17e-5aed2f14dc68" (UID: "faeab210-5195-4d9a-a17e-5aed2f14dc68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.793098 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.793131 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r925\" (UniqueName: \"kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925\") on node \"crc\" DevicePath \"\""
Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.793143 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config\") on node \"crc\" DevicePath \"\""
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.099143 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-th6xc" event={"ID":"faeab210-5195-4d9a-a17e-5aed2f14dc68","Type":"ContainerDied","Data":"105d64f67e44ac823e455dbc7ab3d14e07ef163def457f807561420bfad8e641"}
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.099224 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="105d64f67e44ac823e455dbc7ab3d14e07ef163def457f807561420bfad8e641"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.099306 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-th6xc"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.248762 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"]
Mar 08 21:02:56 crc kubenswrapper[4885]: E0308 21:02:56.249197 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faeab210-5195-4d9a-a17e-5aed2f14dc68" containerName="neutron-db-sync"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.249222 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="faeab210-5195-4d9a-a17e-5aed2f14dc68" containerName="neutron-db-sync"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.249450 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="faeab210-5195-4d9a-a17e-5aed2f14dc68" containerName="neutron-db-sync"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.250468 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.258645 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"]
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.368146 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"
Mar 08 21:02:56 crc kubenswrapper[4885]: E0308 21:02:56.368412 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.412165 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.412257 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvkns\" (UniqueName: \"kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.412330 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.412512 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.412598 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.470344 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67c4b97569-rrjw7"]
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.471596 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.473346 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.473475 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.473762 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-894h5"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.488876 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67c4b97569-rrjw7"]
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.514479 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.514588 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvkns\" (UniqueName: \"kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.514616 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.514709 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.514734 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.515335 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.515477 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.515848 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.516388 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.544313 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvkns\" (UniqueName: \"kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.571270 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.616538 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtbck\" (UniqueName: \"kubernetes.io/projected/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-kube-api-access-rtbck\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.616859 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-combined-ca-bundle\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.616923 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-httpd-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.617006 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.718557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtbck\" (UniqueName: \"kubernetes.io/projected/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-kube-api-access-rtbck\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.718893 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-combined-ca-bundle\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.718972 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-httpd-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.719034 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.728928 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-httpd-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.730329 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.751588 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-combined-ca-bundle\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.758591 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtbck\" (UniqueName: \"kubernetes.io/projected/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-kube-api-access-rtbck\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.792654 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:57 crc kubenswrapper[4885]: I0308 21:02:57.055289 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"]
Mar 08 21:02:57 crc kubenswrapper[4885]: I0308 21:02:57.107913 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" event={"ID":"25b65d53-5174-48c6-a687-f32c1e685bd4","Type":"ContainerStarted","Data":"68e32acd4bade8c213ec06817b3f3e8059d848ef922c170c62e2bbc8d3910483"}
Mar 08 21:02:57 crc kubenswrapper[4885]: I0308 21:02:57.447288 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67c4b97569-rrjw7"]
Mar 08 21:02:57 crc kubenswrapper[4885]: W0308 21:02:57.451405 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68cdeb73_eb92_4d18_8a9f_a5e3a0a53900.slice/crio-447ca89b2f74ea73c5fcd7037926d0556764f0ee9336edf62c3e5af0b9229cdf WatchSource:0}: Error finding container 447ca89b2f74ea73c5fcd7037926d0556764f0ee9336edf62c3e5af0b9229cdf: Status 404 returned error can't find the container with id 447ca89b2f74ea73c5fcd7037926d0556764f0ee9336edf62c3e5af0b9229cdf
Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.139683 4885 generic.go:334] "Generic (PLEG): container finished" podID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerID="22e2cca09cd12eb836eb1e30ec93ff3eca0cfdcebc523a4eeae36a7ba702ee56" exitCode=0
Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.139769 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" event={"ID":"25b65d53-5174-48c6-a687-f32c1e685bd4","Type":"ContainerDied","Data":"22e2cca09cd12eb836eb1e30ec93ff3eca0cfdcebc523a4eeae36a7ba702ee56"}
Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.143384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c4b97569-rrjw7" event={"ID":"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900","Type":"ContainerStarted","Data":"8a20111a453a7ae4487d2f4c906c192d92d9594ce01edaf2227dc794150134f6"}
Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.143419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c4b97569-rrjw7" event={"ID":"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900","Type":"ContainerStarted","Data":"d965a4c3078e1bbdfe2f1bf34f64c1819b3f6cc7a0fb71576d2de428ca874ef1"}
Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.143429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c4b97569-rrjw7" event={"ID":"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900","Type":"ContainerStarted","Data":"447ca89b2f74ea73c5fcd7037926d0556764f0ee9336edf62c3e5af0b9229cdf"}
Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.143824 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.192212 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67c4b97569-rrjw7" podStartSLOduration=2.192188071 podStartE2EDuration="2.192188071s" podCreationTimestamp="2026-03-08 21:02:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:58.185974104 +0000 UTC m=+5479.582028127" watchObservedRunningTime="2026-03-08 21:02:58.192188071 +0000 UTC m=+5479.588242094"
Mar 08 21:02:59 crc kubenswrapper[4885]: I0308 21:02:59.157021 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" event={"ID":"25b65d53-5174-48c6-a687-f32c1e685bd4","Type":"ContainerStarted","Data":"a8688238cdfd1dcfb7637a786ad101466774be95a20bd05d6000a2fb72881a5c"}
Mar 08 21:02:59 crc kubenswrapper[4885]: I0308 21:02:59.157255 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:02:59 crc kubenswrapper[4885]: I0308 21:02:59.197751 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" podStartSLOduration=3.197723683 podStartE2EDuration="3.197723683s" podCreationTimestamp="2026-03-08 21:02:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:59.188259891 +0000 UTC m=+5480.584313964" watchObservedRunningTime="2026-03-08 21:02:59.197723683 +0000 UTC m=+5480.593777746"
Mar 08 21:03:04 crc kubenswrapper[4885]: I0308 21:03:04.905502 4885 scope.go:117] "RemoveContainer" containerID="06ba2614b2073ae88c1afd46a3629242eb9b0dfca6cc39c42c6f2b45e68e1af1"
Mar 08 21:03:06 crc kubenswrapper[4885]: I0308 21:03:06.573203 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw"
Mar 08 21:03:06 crc kubenswrapper[4885]: I0308 21:03:06.662831 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"]
Mar 08 21:03:06 crc kubenswrapper[4885]: I0308 21:03:06.663336 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="dnsmasq-dns" containerID="cri-o://ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a" gracePeriod=10
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.162308 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc"
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.238591 4885 generic.go:334] "Generic (PLEG): container finished" podID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerID="ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a" exitCode=0
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.238648 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc"
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.238878 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerDied","Data":"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a"}
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.239065 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerDied","Data":"926b79f9df7ed6725ef54961ffc104745dc49799aef4900236f92faa1797b387"}
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.239142 4885 scope.go:117] "RemoveContainer" containerID="ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a"
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.265584 4885 scope.go:117] "RemoveContainer" containerID="f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729"
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.284766 4885 scope.go:117] "RemoveContainer" containerID="ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a"
Mar 08 21:03:07 crc kubenswrapper[4885]: E0308 21:03:07.285388 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a\": container with ID starting with ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a not found: ID does not exist" containerID="ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a"
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.285460 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a"} err="failed to get container status \"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a\": rpc error: code = NotFound desc = could not find container \"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a\": container with ID starting with ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a not found: ID does not exist"
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.285504 4885 scope.go:117] "RemoveContainer" containerID="f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729"
Mar 08 21:03:07 crc kubenswrapper[4885]: E0308 21:03:07.285899 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729\": container with ID starting with f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729 not found: ID does not exist" containerID="f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729"
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.285967 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729"} err="failed to get container status \"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729\": rpc error: code = NotFound desc = could not find container \"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729\": container with ID starting with f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729 not found: ID does not exist"
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.318834 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config\") pod \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") "
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.318967 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc\") pod \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") "
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.319037 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn\") pod \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") "
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.319072 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb\") pod \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") "
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.319122 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb\") pod \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") "
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.331147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn" (OuterVolumeSpecName: "kube-api-access-xhppn") pod "3b9b5286-1ced-4445-964d-2ec8fc6a17a4" (UID: "3b9b5286-1ced-4445-964d-2ec8fc6a17a4"). InnerVolumeSpecName "kube-api-access-xhppn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.359255 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config" (OuterVolumeSpecName: "config") pod "3b9b5286-1ced-4445-964d-2ec8fc6a17a4" (UID: "3b9b5286-1ced-4445-964d-2ec8fc6a17a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.366365 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b9b5286-1ced-4445-964d-2ec8fc6a17a4" (UID: "3b9b5286-1ced-4445-964d-2ec8fc6a17a4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.372823 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b9b5286-1ced-4445-964d-2ec8fc6a17a4" (UID: "3b9b5286-1ced-4445-964d-2ec8fc6a17a4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.373576 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b9b5286-1ced-4445-964d-2ec8fc6a17a4" (UID: "3b9b5286-1ced-4445-964d-2ec8fc6a17a4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.421351 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn\") on node \"crc\" DevicePath \"\""
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.421387 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.421396 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.421404 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config\") on node \"crc\" DevicePath \"\""
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.421413 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.574910 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"]
Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.592563 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"]
Mar 08 21:03:08 crc kubenswrapper[4885]: I0308 21:03:08.368748 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"
Mar 08 21:03:08 crc kubenswrapper[4885]: E0308 21:03:08.369355 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:03:09 crc kubenswrapper[4885]: I0308 21:03:09.376987 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" path="/var/lib/kubelet/pods/3b9b5286-1ced-4445-964d-2ec8fc6a17a4/volumes"
Mar 08 21:03:20 crc kubenswrapper[4885]: I0308 21:03:20.368052 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"
Mar 08 21:03:20 crc kubenswrapper[4885]: E0308 21:03:20.369223 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:03:26 crc kubenswrapper[4885]: I0308 21:03:26.803673 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67c4b97569-rrjw7"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.115582 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-plxkf"]
Mar 08 21:03:34 crc kubenswrapper[4885]: E0308 21:03:34.116779 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="dnsmasq-dns"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.116796 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="dnsmasq-dns"
Mar 08 21:03:34 crc kubenswrapper[4885]: E0308 21:03:34.116816 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="init"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.116821 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="init"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.117103 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="dnsmasq-dns"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.117658 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-plxkf"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.178449 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-plxkf"]
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.210969 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-75f7-account-create-update-7q9gr"]
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.212399 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-75f7-account-create-update-7q9gr"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.215954 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.218306 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-75f7-account-create-update-7q9gr"]
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.311909 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j9dd\" (UniqueName: \"kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd\") pod \"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.312055 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts\") pod \"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.312114 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.312148 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxsm4\" (UniqueName: \"kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.413738 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts\") pod \"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.413864 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.413944 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxsm4\" (UniqueName: \"kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.414047 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j9dd\" (UniqueName: \"kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd\") pod \"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr"
Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.414472 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts\") pod 
\"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.414998 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.431854 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j9dd\" (UniqueName: \"kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd\") pod \"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.433727 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxsm4\" (UniqueName: \"kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.434145 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-plxkf" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.535141 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.905430 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-plxkf"] Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.009017 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-75f7-account-create-update-7q9gr"] Mar 08 21:03:35 crc kubenswrapper[4885]: W0308 21:03:35.019320 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7ace920_d540_4598_82db_315caa467acb.slice/crio-09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5 WatchSource:0}: Error finding container 09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5: Status 404 returned error can't find the container with id 09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5 Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.368800 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:03:35 crc kubenswrapper[4885]: E0308 21:03:35.369345 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.560030 4885 generic.go:334] "Generic (PLEG): container finished" podID="e7ace920-d540-4598-82db-315caa467acb" containerID="dff338c2a2d2f522bdec5e9f4d11ce93afde12127a49bc5918d50f6e48f1aa67" exitCode=0 Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.560119 4885 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-75f7-account-create-update-7q9gr" event={"ID":"e7ace920-d540-4598-82db-315caa467acb","Type":"ContainerDied","Data":"dff338c2a2d2f522bdec5e9f4d11ce93afde12127a49bc5918d50f6e48f1aa67"} Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.560178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-75f7-account-create-update-7q9gr" event={"ID":"e7ace920-d540-4598-82db-315caa467acb","Type":"ContainerStarted","Data":"09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5"} Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.562848 4885 generic.go:334] "Generic (PLEG): container finished" podID="af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" containerID="a8dfd7b1b0ea895893398ba92cb9f076303594f9c38eca7bff04c272e28927af" exitCode=0 Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.562960 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-plxkf" event={"ID":"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d","Type":"ContainerDied","Data":"a8dfd7b1b0ea895893398ba92cb9f076303594f9c38eca7bff04c272e28927af"} Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.563008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-plxkf" event={"ID":"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d","Type":"ContainerStarted","Data":"99afc00164b4887db4df66ab7dbb63ab25d278af78e8cf13c894a4a9565021f8"} Mar 08 21:03:36 crc kubenswrapper[4885]: I0308 21:03:36.992344 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:36 crc kubenswrapper[4885]: I0308 21:03:36.997547 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-plxkf" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.171098 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts\") pod \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.171155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxsm4\" (UniqueName: \"kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4\") pod \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.171286 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j9dd\" (UniqueName: \"kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd\") pod \"e7ace920-d540-4598-82db-315caa467acb\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.171351 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts\") pod \"e7ace920-d540-4598-82db-315caa467acb\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.175344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7ace920-d540-4598-82db-315caa467acb" (UID: "e7ace920-d540-4598-82db-315caa467acb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.177514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" (UID: "af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.185843 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd" (OuterVolumeSpecName: "kube-api-access-4j9dd") pod "e7ace920-d540-4598-82db-315caa467acb" (UID: "e7ace920-d540-4598-82db-315caa467acb"). InnerVolumeSpecName "kube-api-access-4j9dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.190604 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4" (OuterVolumeSpecName: "kube-api-access-vxsm4") pod "af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" (UID: "af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d"). InnerVolumeSpecName "kube-api-access-vxsm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.273863 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.274099 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.274165 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxsm4\" (UniqueName: \"kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.274258 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j9dd\" (UniqueName: \"kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.581015 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-75f7-account-create-update-7q9gr" event={"ID":"e7ace920-d540-4598-82db-315caa467acb","Type":"ContainerDied","Data":"09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5"} Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.581616 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.581031 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.583354 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-plxkf" event={"ID":"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d","Type":"ContainerDied","Data":"99afc00164b4887db4df66ab7dbb63ab25d278af78e8cf13c894a4a9565021f8"} Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.583415 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99afc00164b4887db4df66ab7dbb63ab25d278af78e8cf13c894a4a9565021f8" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.583498 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-plxkf" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.440690 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-m792h"] Mar 08 21:03:39 crc kubenswrapper[4885]: E0308 21:03:39.441263 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ace920-d540-4598-82db-315caa467acb" containerName="mariadb-account-create-update" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.441287 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ace920-d540-4598-82db-315caa467acb" containerName="mariadb-account-create-update" Mar 08 21:03:39 crc kubenswrapper[4885]: E0308 21:03:39.441321 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" containerName="mariadb-database-create" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.441336 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" containerName="mariadb-database-create" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.441656 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ace920-d540-4598-82db-315caa467acb" containerName="mariadb-account-create-update" 
Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.441703 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" containerName="mariadb-database-create" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.442657 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.445222 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.453870 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zrqqr" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.455265 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m792h"] Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.524176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.524493 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.524547 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mngp9\" (UniqueName: \"kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9\") pod \"glance-db-sync-m792h\" (UID: 
\"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.524611 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.627188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.627275 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.627351 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mngp9\" (UniqueName: \"kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.627452 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 
21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.633026 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.649657 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.650849 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.656492 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mngp9\" (UniqueName: \"kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.769284 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m792h" Mar 08 21:03:40 crc kubenswrapper[4885]: I0308 21:03:40.126243 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m792h"] Mar 08 21:03:40 crc kubenswrapper[4885]: I0308 21:03:40.613530 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m792h" event={"ID":"07fa5fd1-f6b9-4206-809d-c1f04533cab4","Type":"ContainerStarted","Data":"fe5f1b7fdd56f4ad8279dbe7f7cd91d8f6dc677fcbcff32122312a7eeb244697"} Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.438129 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.442715 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.453023 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.561088 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.561167 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.561273 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwpp2\" (UniqueName: \"kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.627401 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m792h" event={"ID":"07fa5fd1-f6b9-4206-809d-c1f04533cab4","Type":"ContainerStarted","Data":"238463b8258e15f4cb33c673abe4bc3d05f4cb5a4961563b7dbb833ea2602b95"} Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.663021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.663094 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.663266 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwpp2\" (UniqueName: \"kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.663955 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.664017 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.686083 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwpp2\" (UniqueName: \"kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.775318 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.368006 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-m792h" podStartSLOduration=3.367979201 podStartE2EDuration="3.367979201s" podCreationTimestamp="2026-03-08 21:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:41.648332945 +0000 UTC m=+5523.044386978" watchObservedRunningTime="2026-03-08 21:03:42.367979201 +0000 UTC m=+5523.764033254" Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.377036 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:42 crc kubenswrapper[4885]: W0308 21:03:42.380048 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e84cb88_21d6_41d6_9352_b29d1953fa9f.slice/crio-9cbee94b6981b1190afd3a1f4953df7c9ada70d60fa8e7e0f457106f9ec4cc05 WatchSource:0}: Error finding container 9cbee94b6981b1190afd3a1f4953df7c9ada70d60fa8e7e0f457106f9ec4cc05: Status 404 returned error can't find the container with id 9cbee94b6981b1190afd3a1f4953df7c9ada70d60fa8e7e0f457106f9ec4cc05 Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.645298 4885 generic.go:334] "Generic (PLEG): container finished" podID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerID="18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140" exitCode=0 Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.645393 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerDied","Data":"18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140"} Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.645468 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerStarted","Data":"9cbee94b6981b1190afd3a1f4953df7c9ada70d60fa8e7e0f457106f9ec4cc05"} Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.656653 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:03:43 crc kubenswrapper[4885]: I0308 21:03:43.658107 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerStarted","Data":"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c"} Mar 08 21:03:44 crc kubenswrapper[4885]: I0308 21:03:44.677195 4885 generic.go:334] "Generic (PLEG): container finished" podID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerID="1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c" exitCode=0 Mar 08 21:03:44 crc kubenswrapper[4885]: I0308 21:03:44.677570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerDied","Data":"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c"} Mar 08 21:03:44 crc kubenswrapper[4885]: I0308 21:03:44.684642 4885 generic.go:334] "Generic (PLEG): container finished" podID="07fa5fd1-f6b9-4206-809d-c1f04533cab4" containerID="238463b8258e15f4cb33c673abe4bc3d05f4cb5a4961563b7dbb833ea2602b95" exitCode=0 Mar 08 21:03:44 crc kubenswrapper[4885]: I0308 21:03:44.684696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m792h" event={"ID":"07fa5fd1-f6b9-4206-809d-c1f04533cab4","Type":"ContainerDied","Data":"238463b8258e15f4cb33c673abe4bc3d05f4cb5a4961563b7dbb833ea2602b95"} Mar 08 21:03:45 crc kubenswrapper[4885]: I0308 21:03:45.699053 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerStarted","Data":"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e"} Mar 08 21:03:45 crc kubenswrapper[4885]: I0308 21:03:45.735331 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ljp6s" podStartSLOduration=2.194778705 podStartE2EDuration="4.735305668s" podCreationTimestamp="2026-03-08 21:03:41 +0000 UTC" firstStartedPulling="2026-03-08 21:03:42.652623104 +0000 UTC m=+5524.048677167" lastFinishedPulling="2026-03-08 21:03:45.193150097 +0000 UTC m=+5526.589204130" observedRunningTime="2026-03-08 21:03:45.729572865 +0000 UTC m=+5527.125626948" watchObservedRunningTime="2026-03-08 21:03:45.735305668 +0000 UTC m=+5527.131359701" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.112649 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-m792h" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.255097 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data\") pod \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.255178 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mngp9\" (UniqueName: \"kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9\") pod \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.255280 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data\") pod \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.255460 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle\") pod \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.261388 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9" (OuterVolumeSpecName: "kube-api-access-mngp9") pod "07fa5fd1-f6b9-4206-809d-c1f04533cab4" (UID: "07fa5fd1-f6b9-4206-809d-c1f04533cab4"). InnerVolumeSpecName "kube-api-access-mngp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.261425 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "07fa5fd1-f6b9-4206-809d-c1f04533cab4" (UID: "07fa5fd1-f6b9-4206-809d-c1f04533cab4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.304408 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07fa5fd1-f6b9-4206-809d-c1f04533cab4" (UID: "07fa5fd1-f6b9-4206-809d-c1f04533cab4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.339642 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data" (OuterVolumeSpecName: "config-data") pod "07fa5fd1-f6b9-4206-809d-c1f04533cab4" (UID: "07fa5fd1-f6b9-4206-809d-c1f04533cab4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.358375 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.358402 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mngp9\" (UniqueName: \"kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.358416 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.358427 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.706678 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m792h" event={"ID":"07fa5fd1-f6b9-4206-809d-c1f04533cab4","Type":"ContainerDied","Data":"fe5f1b7fdd56f4ad8279dbe7f7cd91d8f6dc677fcbcff32122312a7eeb244697"} Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.707029 4885 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="fe5f1b7fdd56f4ad8279dbe7f7cd91d8f6dc677fcbcff32122312a7eeb244697" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.706739 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-m792h" Mar 08 21:03:46 crc kubenswrapper[4885]: E0308 21:03:46.918490 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07fa5fd1_f6b9_4206_809d_c1f04533cab4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07fa5fd1_f6b9_4206_809d_c1f04533cab4.slice/crio-fe5f1b7fdd56f4ad8279dbe7f7cd91d8f6dc677fcbcff32122312a7eeb244697\": RecentStats: unable to find data in memory cache]" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.993138 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:46 crc kubenswrapper[4885]: E0308 21:03:46.993487 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07fa5fd1-f6b9-4206-809d-c1f04533cab4" containerName="glance-db-sync" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.993504 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="07fa5fd1-f6b9-4206-809d-c1f04533cab4" containerName="glance-db-sync" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.993667 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="07fa5fd1-f6b9-4206-809d-c1f04533cab4" containerName="glance-db-sync" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.994469 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.999314 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.999840 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.000085 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zrqqr" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.001078 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.010328 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.075871 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.075952 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.075989 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.076019 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbdl9\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.076042 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.076064 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.076142 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.126709 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"] Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.127978 4885 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.149348 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"] Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.177778 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.177835 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.177862 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.177892 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.177978 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbdl9\" (UniqueName: 
\"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.178003 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.178023 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.178991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.179199 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.186529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.187208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.190290 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.192907 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.205561 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbdl9\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.220443 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.221740 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.231084 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.242751 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.281945 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.282007 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.282025 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.282128 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " 
pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.282726 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276lt\" (UniqueName: \"kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.321290 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385206 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385250 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385273 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385303 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385327 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276lt\" (UniqueName: \"kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385359 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385383 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385403 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385421 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385435 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptbgp\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.386334 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.387100 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.387590 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.395427 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.452800 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276lt\" (UniqueName: \"kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.454581 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.488809 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptbgp\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.488902 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.488956 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.488983 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.489042 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 
crc kubenswrapper[4885]: I0308 21:03:47.489065 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.489081 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.494758 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.503529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.504351 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.504588 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.507901 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.518934 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.527243 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptbgp\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.593474 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.936381 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:47 crc kubenswrapper[4885]: W0308 21:03:47.942034 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdfd0414_9452_430e_a4ea_bb57121b0424.slice/crio-5908c98ec19f6c2b530558ced726b2bfe547ec2e7f58b25b7f725fb479f01bb1 WatchSource:0}: Error finding container 5908c98ec19f6c2b530558ced726b2bfe547ec2e7f58b25b7f725fb479f01bb1: Status 404 returned error can't find the container with id 5908c98ec19f6c2b530558ced726b2bfe547ec2e7f58b25b7f725fb479f01bb1 Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.027671 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"] Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.223818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.237414 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:48 crc kubenswrapper[4885]: W0308 21:03:48.238562 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a6f23b6_b553_46eb_9a92_f33f68a80294.slice/crio-96f6b15c1d1937a90bc46575bd54e1aa028b4e4abfbd6494dc64c71238146a31 WatchSource:0}: Error finding container 96f6b15c1d1937a90bc46575bd54e1aa028b4e4abfbd6494dc64c71238146a31: Status 404 returned error can't find the container with id 96f6b15c1d1937a90bc46575bd54e1aa028b4e4abfbd6494dc64c71238146a31 Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.736336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerStarted","Data":"96f6b15c1d1937a90bc46575bd54e1aa028b4e4abfbd6494dc64c71238146a31"} Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.738580 4885 generic.go:334] "Generic (PLEG): container finished" podID="a16f7c8b-b930-4591-a967-9db46c52391c" containerID="1ebba46867f58104b9cbcc29fb91d1649cf147537f67ba19ec589a45bcb62ce8" exitCode=0 Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.738652 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54fd489df-k9th6" event={"ID":"a16f7c8b-b930-4591-a967-9db46c52391c","Type":"ContainerDied","Data":"1ebba46867f58104b9cbcc29fb91d1649cf147537f67ba19ec589a45bcb62ce8"} Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.738668 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54fd489df-k9th6" event={"ID":"a16f7c8b-b930-4591-a967-9db46c52391c","Type":"ContainerStarted","Data":"1a913a93c16bbf12b9927ddd796c54de956ba88ff05aa50f518989cc7d07d3d0"} Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.744862 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerStarted","Data":"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815"} Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.744887 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerStarted","Data":"5908c98ec19f6c2b530558ced726b2bfe547ec2e7f58b25b7f725fb479f01bb1"} Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.374550 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:03:49 crc kubenswrapper[4885]: E0308 21:03:49.375047 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.754257 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerStarted","Data":"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266"} Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.754312 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerStarted","Data":"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a"} Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.756017 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54fd489df-k9th6" event={"ID":"a16f7c8b-b930-4591-a967-9db46c52391c","Type":"ContainerStarted","Data":"7dd39daad27eae124834c42bda6676c4988f5e52cc85f87170d8845fcdc1c6e4"} Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.756431 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.758432 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerStarted","Data":"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88"} Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.758536 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-log" containerID="cri-o://440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" gracePeriod=30 Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.758756 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-httpd" containerID="cri-o://d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" gracePeriod=30 Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.787344 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.787292368 podStartE2EDuration="2.787292368s" podCreationTimestamp="2026-03-08 21:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:49.78546554 +0000 UTC m=+5531.181519573" watchObservedRunningTime="2026-03-08 21:03:49.787292368 +0000 UTC m=+5531.183346441" Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.808133 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.808115994 podStartE2EDuration="3.808115994s" podCreationTimestamp="2026-03-08 21:03:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:49.803128521 +0000 UTC m=+5531.199182554" watchObservedRunningTime="2026-03-08 21:03:49.808115994 +0000 UTC m=+5531.204170027" Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.828494 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54fd489df-k9th6" podStartSLOduration=2.828474597 podStartE2EDuration="2.828474597s" podCreationTimestamp="2026-03-08 21:03:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:49.826252287 +0000 UTC m=+5531.222306310" watchObservedRunningTime="2026-03-08 21:03:49.828474597 +0000 UTC m=+5531.224528640" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.032894 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.296239 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.449781 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.449969 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450023 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450049 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbdl9\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: 
\"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450116 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450157 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450195 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450799 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450863 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs" (OuterVolumeSpecName: "logs") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.451060 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.456562 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9" (OuterVolumeSpecName: "kube-api-access-rbdl9") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "kube-api-access-rbdl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.456621 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts" (OuterVolumeSpecName: "scripts") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.458226 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph" (OuterVolumeSpecName: "ceph") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.489555 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.503183 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data" (OuterVolumeSpecName: "config-data") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552408 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552450 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbdl9\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552463 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552473 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552481 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552489 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767634 4885 generic.go:334] "Generic (PLEG): container finished" podID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerID="d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" exitCode=0 Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767666 4885 generic.go:334] "Generic (PLEG): container finished" podID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerID="440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" exitCode=143 Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767690 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerDied","Data":"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88"} Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767715 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767744 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerDied","Data":"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815"} Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerDied","Data":"5908c98ec19f6c2b530558ced726b2bfe547ec2e7f58b25b7f725fb479f01bb1"} Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767791 4885 scope.go:117] "RemoveContainer" containerID="d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.791411 4885 scope.go:117] "RemoveContainer" containerID="440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.806789 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.824299 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.838120 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:50 crc kubenswrapper[4885]: E0308 21:03:50.838687 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-log" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.838717 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-log" Mar 08 21:03:50 crc kubenswrapper[4885]: E0308 21:03:50.838746 
4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-httpd" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.838758 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-httpd" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.839041 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-httpd" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.839067 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-log" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.840125 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.841216 4885 scope.go:117] "RemoveContainer" containerID="d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.842287 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 21:03:50 crc kubenswrapper[4885]: E0308 21:03:50.842896 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88\": container with ID starting with d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88 not found: ID does not exist" containerID="d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.842944 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88"} err="failed to get container 
status \"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88\": rpc error: code = NotFound desc = could not find container \"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88\": container with ID starting with d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88 not found: ID does not exist" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.842968 4885 scope.go:117] "RemoveContainer" containerID="440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" Mar 08 21:03:50 crc kubenswrapper[4885]: E0308 21:03:50.843761 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815\": container with ID starting with 440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815 not found: ID does not exist" containerID="440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.843816 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815"} err="failed to get container status \"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815\": rpc error: code = NotFound desc = could not find container \"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815\": container with ID starting with 440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815 not found: ID does not exist" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.843842 4885 scope.go:117] "RemoveContainer" containerID="d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.847481 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88"} err="failed to get 
container status \"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88\": rpc error: code = NotFound desc = could not find container \"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88\": container with ID starting with d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88 not found: ID does not exist" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.847531 4885 scope.go:117] "RemoveContainer" containerID="440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.861256 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815"} err="failed to get container status \"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815\": rpc error: code = NotFound desc = could not find container \"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815\": container with ID starting with 440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815 not found: ID does not exist" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.870841 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.966425 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.966601 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.966701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.966780 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qvkv\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.966885 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.967042 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.967120 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069365 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069479 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069546 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069621 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qvkv\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069667 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs\") pod \"glance-default-external-api-0\" (UID: 
\"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069734 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069785 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.070624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.070763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.074967 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 
21:03:51.075637 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.076113 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.076392 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.087811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qvkv\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.171265 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.381323 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" path="/var/lib/kubelet/pods/cdfd0414-9452-430e-a4ea-bb57121b0424/volumes" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.753963 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.776376 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.776501 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.785831 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerStarted","Data":"8a73a30c989c3b2ba3367e2ccdac633bed1d9b688e4b2e65328b8a2f07a6fe3b"} Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.786193 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-log" containerID="cri-o://33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" gracePeriod=30 Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.786281 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-httpd" containerID="cri-o://6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" gracePeriod=30 Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.841858 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.350332 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497587 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptbgp\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497640 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497699 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497718 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: 
\"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497743 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497804 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.498023 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.498773 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.501039 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts" (OuterVolumeSpecName: "scripts") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.501269 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp" (OuterVolumeSpecName: "kube-api-access-ptbgp") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "kube-api-access-ptbgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.501282 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs" (OuterVolumeSpecName: "logs") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.509244 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph" (OuterVolumeSpecName: "ceph") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.530520 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.548946 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data" (OuterVolumeSpecName: "config-data") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600428 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptbgp\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600474 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600491 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600505 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600518 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600531 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.812304 4885 generic.go:334] "Generic (PLEG): container finished" podID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerID="6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" exitCode=0 Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.812349 4885 generic.go:334] "Generic (PLEG): container finished" podID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerID="33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" exitCode=143 Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.812497 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.813901 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerDied","Data":"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266"} Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.813972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerDied","Data":"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a"} Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.813984 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerDied","Data":"96f6b15c1d1937a90bc46575bd54e1aa028b4e4abfbd6494dc64c71238146a31"} Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.814005 4885 scope.go:117] "RemoveContainer" containerID="6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.823586 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerStarted","Data":"393e441f3ad9404ef23527c1976a928e080636334f6d4fef814d8251c19b8033"} Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.870753 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.874341 4885 scope.go:117] "RemoveContainer" containerID="33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.877137 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.893641 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:52 crc kubenswrapper[4885]: E0308 21:03:52.894303 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-httpd" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.894323 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-httpd" Mar 08 21:03:52 crc kubenswrapper[4885]: E0308 21:03:52.894337 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-log" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.894343 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-log" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.894508 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-httpd" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.894522 4885 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-log" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.895373 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.897499 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.907611 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.910820 4885 scope.go:117] "RemoveContainer" containerID="6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.910890 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:52 crc kubenswrapper[4885]: E0308 21:03:52.911476 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266\": container with ID starting with 6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266 not found: ID does not exist" containerID="6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.911509 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266"} err="failed to get container status \"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266\": rpc error: code = NotFound desc = could not find container \"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266\": container with ID starting with 
6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266 not found: ID does not exist" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.911533 4885 scope.go:117] "RemoveContainer" containerID="33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" Mar 08 21:03:52 crc kubenswrapper[4885]: E0308 21:03:52.911872 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a\": container with ID starting with 33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a not found: ID does not exist" containerID="33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.911897 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a"} err="failed to get container status \"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a\": rpc error: code = NotFound desc = could not find container \"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a\": container with ID starting with 33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a not found: ID does not exist" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.911911 4885 scope.go:117] "RemoveContainer" containerID="6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.912400 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266"} err="failed to get container status \"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266\": rpc error: code = NotFound desc = could not find container \"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266\": container with ID 
starting with 6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266 not found: ID does not exist" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.912437 4885 scope.go:117] "RemoveContainer" containerID="33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.912737 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a"} err="failed to get container status \"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a\": rpc error: code = NotFound desc = could not find container \"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a\": container with ID starting with 33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a not found: ID does not exist" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.979573 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006552 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006648 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006671 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006719 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhs2t\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006828 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006859 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108154 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108234 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108255 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108294 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhs2t\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108327 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108392 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108415 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108898 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.110319 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.112684 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.114136 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.116137 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.117358 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.125457 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhs2t\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.236811 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.384498 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" path="/var/lib/kubelet/pods/0a6f23b6-b553-46eb-9a92-f33f68a80294/volumes" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.805804 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:53 crc kubenswrapper[4885]: W0308 21:03:53.824525 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf82d5c8f_6a14_4b1b_9143_eb52cf7e67e8.slice/crio-c1d3ebe64d38a65fd438521c81fee0953dda62071a7f9b6b92b7e2810c5ba230 WatchSource:0}: Error finding container c1d3ebe64d38a65fd438521c81fee0953dda62071a7f9b6b92b7e2810c5ba230: Status 404 returned error can't find the container with id c1d3ebe64d38a65fd438521c81fee0953dda62071a7f9b6b92b7e2810c5ba230 Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.835802 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerStarted","Data":"c1d3ebe64d38a65fd438521c81fee0953dda62071a7f9b6b92b7e2810c5ba230"} Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.838414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerStarted","Data":"45cd60c6ca50a5d19396518626fd9ead690756368037eda8deb040859bd438c4"} Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.860804 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.860788382 podStartE2EDuration="3.860788382s" podCreationTimestamp="2026-03-08 21:03:50 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:53.858248874 +0000 UTC m=+5535.254302887" watchObservedRunningTime="2026-03-08 21:03:53.860788382 +0000 UTC m=+5535.256842405" Mar 08 21:03:54 crc kubenswrapper[4885]: I0308 21:03:54.863528 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ljp6s" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="registry-server" containerID="cri-o://bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e" gracePeriod=2 Mar 08 21:03:54 crc kubenswrapper[4885]: I0308 21:03:54.864016 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerStarted","Data":"6b230c929340da8fad3a15c45277ba0e659ca5d1578d43a5536de68f31fcf158"} Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.355491 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.472116 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities\") pod \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.472745 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content\") pod \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.472771 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwpp2\" (UniqueName: \"kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2\") pod \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.472893 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities" (OuterVolumeSpecName: "utilities") pod "0e84cb88-21d6-41d6-9352-b29d1953fa9f" (UID: "0e84cb88-21d6-41d6-9352-b29d1953fa9f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.473617 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.480423 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2" (OuterVolumeSpecName: "kube-api-access-nwpp2") pod "0e84cb88-21d6-41d6-9352-b29d1953fa9f" (UID: "0e84cb88-21d6-41d6-9352-b29d1953fa9f"). InnerVolumeSpecName "kube-api-access-nwpp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.520119 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e84cb88-21d6-41d6-9352-b29d1953fa9f" (UID: "0e84cb88-21d6-41d6-9352-b29d1953fa9f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.576076 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.576139 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwpp2\" (UniqueName: \"kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.878790 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerStarted","Data":"77cc037af22a97f866052e3343ac5fbb7bf64bc7562100db2c700da1dbaae719"} Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.886706 4885 generic.go:334] "Generic (PLEG): container finished" podID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerID="bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e" exitCode=0 Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.886774 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerDied","Data":"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e"} Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.886823 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerDied","Data":"9cbee94b6981b1190afd3a1f4953df7c9ada70d60fa8e7e0f457106f9ec4cc05"} Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.886852 4885 scope.go:117] "RemoveContainer" 
containerID="bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.887137 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.914627 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.9145978919999997 podStartE2EDuration="3.914597892s" podCreationTimestamp="2026-03-08 21:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:55.909387763 +0000 UTC m=+5537.305441866" watchObservedRunningTime="2026-03-08 21:03:55.914597892 +0000 UTC m=+5537.310651945" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.925728 4885 scope.go:117] "RemoveContainer" containerID="1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.956133 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.965446 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.977820 4885 scope.go:117] "RemoveContainer" containerID="18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.016707 4885 scope.go:117] "RemoveContainer" containerID="bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e" Mar 08 21:03:56 crc kubenswrapper[4885]: E0308 21:03:56.017377 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e\": 
container with ID starting with bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e not found: ID does not exist" containerID="bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.017409 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e"} err="failed to get container status \"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e\": rpc error: code = NotFound desc = could not find container \"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e\": container with ID starting with bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e not found: ID does not exist" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.017432 4885 scope.go:117] "RemoveContainer" containerID="1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c" Mar 08 21:03:56 crc kubenswrapper[4885]: E0308 21:03:56.017803 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c\": container with ID starting with 1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c not found: ID does not exist" containerID="1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.017830 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c"} err="failed to get container status \"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c\": rpc error: code = NotFound desc = could not find container \"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c\": container with ID starting with 
1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c not found: ID does not exist" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.017850 4885 scope.go:117] "RemoveContainer" containerID="18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140" Mar 08 21:03:56 crc kubenswrapper[4885]: E0308 21:03:56.018328 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140\": container with ID starting with 18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140 not found: ID does not exist" containerID="18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.018354 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140"} err="failed to get container status \"18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140\": rpc error: code = NotFound desc = could not find container \"18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140\": container with ID starting with 18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140 not found: ID does not exist" Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.387846 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" path="/var/lib/kubelet/pods/0e84cb88-21d6-41d6-9352-b29d1953fa9f/volumes" Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.457114 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.516289 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"] Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.516710 4885 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="dnsmasq-dns" containerID="cri-o://a8688238cdfd1dcfb7637a786ad101466774be95a20bd05d6000a2fb72881a5c" gracePeriod=10 Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.915075 4885 generic.go:334] "Generic (PLEG): container finished" podID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerID="a8688238cdfd1dcfb7637a786ad101466774be95a20bd05d6000a2fb72881a5c" exitCode=0 Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.915178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" event={"ID":"25b65d53-5174-48c6-a687-f32c1e685bd4","Type":"ContainerDied","Data":"a8688238cdfd1dcfb7637a786ad101466774be95a20bd05d6000a2fb72881a5c"} Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.915353 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" event={"ID":"25b65d53-5174-48c6-a687-f32c1e685bd4","Type":"ContainerDied","Data":"68e32acd4bade8c213ec06817b3f3e8059d848ef922c170c62e2bbc8d3910483"} Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.915369 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e32acd4bade8c213ec06817b3f3e8059d848ef922c170c62e2bbc8d3910483" Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.989720 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.128875 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb\") pod \"25b65d53-5174-48c6-a687-f32c1e685bd4\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.129081 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc\") pod \"25b65d53-5174-48c6-a687-f32c1e685bd4\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.129426 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvkns\" (UniqueName: \"kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns\") pod \"25b65d53-5174-48c6-a687-f32c1e685bd4\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.129535 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb\") pod \"25b65d53-5174-48c6-a687-f32c1e685bd4\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.129826 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config\") pod \"25b65d53-5174-48c6-a687-f32c1e685bd4\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.137830 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns" (OuterVolumeSpecName: "kube-api-access-vvkns") pod "25b65d53-5174-48c6-a687-f32c1e685bd4" (UID: "25b65d53-5174-48c6-a687-f32c1e685bd4"). InnerVolumeSpecName "kube-api-access-vvkns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.193001 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25b65d53-5174-48c6-a687-f32c1e685bd4" (UID: "25b65d53-5174-48c6-a687-f32c1e685bd4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.193141 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config" (OuterVolumeSpecName: "config") pod "25b65d53-5174-48c6-a687-f32c1e685bd4" (UID: "25b65d53-5174-48c6-a687-f32c1e685bd4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.209532 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25b65d53-5174-48c6-a687-f32c1e685bd4" (UID: "25b65d53-5174-48c6-a687-f32c1e685bd4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.223290 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25b65d53-5174-48c6-a687-f32c1e685bd4" (UID: "25b65d53-5174-48c6-a687-f32c1e685bd4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.232434 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.232482 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.232501 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.232518 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.232535 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvkns\" (UniqueName: \"kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.924306 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.967196 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"] Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.975340 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"] Mar 08 21:03:59 crc kubenswrapper[4885]: I0308 21:03:59.387652 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" path="/var/lib/kubelet/pods/25b65d53-5174-48c6-a687-f32c1e685bd4/volumes" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.157837 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550064-54cxw"] Mar 08 21:04:00 crc kubenswrapper[4885]: E0308 21:04:00.158389 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="init" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158410 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="init" Mar 08 21:04:00 crc kubenswrapper[4885]: E0308 21:04:00.158434 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="registry-server" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158448 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="registry-server" Mar 08 21:04:00 crc kubenswrapper[4885]: E0308 21:04:00.158481 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="dnsmasq-dns" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158499 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="dnsmasq-dns" Mar 08 21:04:00 crc 
kubenswrapper[4885]: E0308 21:04:00.158527 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="extract-content" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158539 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="extract-content" Mar 08 21:04:00 crc kubenswrapper[4885]: E0308 21:04:00.158577 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="extract-utilities" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158590 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="extract-utilities" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158915 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="dnsmasq-dns" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.159029 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="registry-server" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.161167 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.163984 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.164302 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.164550 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.177367 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550064-54cxw"] Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.271625 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzngm\" (UniqueName: \"kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm\") pod \"auto-csr-approver-29550064-54cxw\" (UID: \"94b5a791-d720-4a5c-9138-abe584a56755\") " pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.373965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzngm\" (UniqueName: \"kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm\") pod \"auto-csr-approver-29550064-54cxw\" (UID: \"94b5a791-d720-4a5c-9138-abe584a56755\") " pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.412874 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzngm\" (UniqueName: \"kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm\") pod \"auto-csr-approver-29550064-54cxw\" (UID: \"94b5a791-d720-4a5c-9138-abe584a56755\") " 
pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.493418 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.893162 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550064-54cxw"] Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.957151 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550064-54cxw" event={"ID":"94b5a791-d720-4a5c-9138-abe584a56755","Type":"ContainerStarted","Data":"41311d57695392d2a865a329f680d241f0b1767d02d297dae39a86facba3ade4"} Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.172467 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.172590 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.223518 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.239985 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.369489 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:04:01 crc kubenswrapper[4885]: E0308 21:04:01.370128 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.968871 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.968965 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 21:04:02 crc kubenswrapper[4885]: I0308 21:04:02.985872 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550064-54cxw" event={"ID":"94b5a791-d720-4a5c-9138-abe584a56755","Type":"ContainerStarted","Data":"d40c8b02d2c6b1b5fefbc9a10d09bda45776bb36be850d751409c013d3a63ca6"} Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.015009 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550064-54cxw" podStartSLOduration=1.518519096 podStartE2EDuration="3.014984882s" podCreationTimestamp="2026-03-08 21:04:00 +0000 UTC" firstStartedPulling="2026-03-08 21:04:00.901304663 +0000 UTC m=+5542.297358726" lastFinishedPulling="2026-03-08 21:04:02.397770459 +0000 UTC m=+5543.793824512" observedRunningTime="2026-03-08 21:04:03.008726065 +0000 UTC m=+5544.404780118" watchObservedRunningTime="2026-03-08 21:04:03.014984882 +0000 UTC m=+5544.411038935" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.237570 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.237648 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.282001 4885 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.311898 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.906304 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.907405 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 21:04:04 crc kubenswrapper[4885]: I0308 21:04:04.001372 4885 generic.go:334] "Generic (PLEG): container finished" podID="94b5a791-d720-4a5c-9138-abe584a56755" containerID="d40c8b02d2c6b1b5fefbc9a10d09bda45776bb36be850d751409c013d3a63ca6" exitCode=0 Mar 08 21:04:04 crc kubenswrapper[4885]: I0308 21:04:04.001516 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550064-54cxw" event={"ID":"94b5a791-d720-4a5c-9138-abe584a56755","Type":"ContainerDied","Data":"d40c8b02d2c6b1b5fefbc9a10d09bda45776bb36be850d751409c013d3a63ca6"} Mar 08 21:04:04 crc kubenswrapper[4885]: I0308 21:04:04.002098 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:04 crc kubenswrapper[4885]: I0308 21:04:04.002143 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.348054 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.483218 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzngm\" (UniqueName: \"kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm\") pod \"94b5a791-d720-4a5c-9138-abe584a56755\" (UID: \"94b5a791-d720-4a5c-9138-abe584a56755\") " Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.521147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm" (OuterVolumeSpecName: "kube-api-access-tzngm") pod "94b5a791-d720-4a5c-9138-abe584a56755" (UID: "94b5a791-d720-4a5c-9138-abe584a56755"). InnerVolumeSpecName "kube-api-access-tzngm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.587809 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzngm\" (UniqueName: \"kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.907520 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.942104 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:06 crc kubenswrapper[4885]: I0308 21:04:06.028334 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:06 crc kubenswrapper[4885]: I0308 21:04:06.028345 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550064-54cxw" event={"ID":"94b5a791-d720-4a5c-9138-abe584a56755","Type":"ContainerDied","Data":"41311d57695392d2a865a329f680d241f0b1767d02d297dae39a86facba3ade4"} Mar 08 21:04:06 crc kubenswrapper[4885]: I0308 21:04:06.028426 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41311d57695392d2a865a329f680d241f0b1767d02d297dae39a86facba3ade4" Mar 08 21:04:06 crc kubenswrapper[4885]: I0308 21:04:06.085107 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550058-z6lnn"] Mar 08 21:04:06 crc kubenswrapper[4885]: I0308 21:04:06.091318 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550058-z6lnn"] Mar 08 21:04:07 crc kubenswrapper[4885]: I0308 21:04:07.377381 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f14f7f-4dec-4d9d-a320-7a5c927d4983" path="/var/lib/kubelet/pods/95f14f7f-4dec-4d9d-a320-7a5c927d4983/volumes" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.048570 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-txw9w"] Mar 08 21:04:14 crc kubenswrapper[4885]: E0308 21:04:14.049431 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b5a791-d720-4a5c-9138-abe584a56755" containerName="oc" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.049443 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b5a791-d720-4a5c-9138-abe584a56755" containerName="oc" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.049584 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b5a791-d720-4a5c-9138-abe584a56755" containerName="oc" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.050118 
4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.068947 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-txw9w"] Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.148007 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5511-account-create-update-fvjhm"] Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.170653 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.174321 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.175496 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzjsb\" (UniqueName: \"kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.175562 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.186987 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5511-account-create-update-fvjhm"] Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.277814 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzjsb\" (UniqueName: 
\"kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.277903 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.278059 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g775d\" (UniqueName: \"kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d\") pod \"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.278165 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts\") pod \"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.279204 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.312211 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-vzjsb\" (UniqueName: \"kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.373535 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.380221 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts\") pod \"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.380485 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g775d\" (UniqueName: \"kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d\") pod \"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.381763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts\") pod \"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.408577 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g775d\" (UniqueName: \"kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d\") pod 
\"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.510273 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.928178 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-txw9w"] Mar 08 21:04:14 crc kubenswrapper[4885]: W0308 21:04:14.935187 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14dd4829_951f_4e19_885f_f466dcbf9d1b.slice/crio-bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687 WatchSource:0}: Error finding container bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687: Status 404 returned error can't find the container with id bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687 Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.003210 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5511-account-create-update-fvjhm"] Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.148020 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-txw9w" event={"ID":"14dd4829-951f-4e19-885f-f466dcbf9d1b","Type":"ContainerStarted","Data":"38c31fe4cedc7f6d10ab5073880d121f8be2e24d735953c27d9d2bfdad42cb59"} Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.148068 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-txw9w" event={"ID":"14dd4829-951f-4e19-885f-f466dcbf9d1b","Type":"ContainerStarted","Data":"bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687"} Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.150515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-5511-account-create-update-fvjhm" event={"ID":"8664af8f-0cf2-4ef8-a701-adbaba058240","Type":"ContainerStarted","Data":"3fa287746faf0fe91da972d4c41143470d41b6c0871ff490ed2e7825740674d4"} Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.162996 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-txw9w" podStartSLOduration=1.162977288 podStartE2EDuration="1.162977288s" podCreationTimestamp="2026-03-08 21:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:04:15.158851918 +0000 UTC m=+5556.554905961" watchObservedRunningTime="2026-03-08 21:04:15.162977288 +0000 UTC m=+5556.559031321" Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.177442 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5511-account-create-update-fvjhm" podStartSLOduration=1.177420873 podStartE2EDuration="1.177420873s" podCreationTimestamp="2026-03-08 21:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:04:15.173749586 +0000 UTC m=+5556.569803609" watchObservedRunningTime="2026-03-08 21:04:15.177420873 +0000 UTC m=+5556.573474916" Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.369127 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:04:15 crc kubenswrapper[4885]: E0308 21:04:15.369652 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:04:16 crc kubenswrapper[4885]: I0308 21:04:16.175264 4885 generic.go:334] "Generic (PLEG): container finished" podID="8664af8f-0cf2-4ef8-a701-adbaba058240" containerID="5a8b5b45c081a377860a6fc52da869749d2af03a3b4e62e944ef9b2a484b5105" exitCode=0 Mar 08 21:04:16 crc kubenswrapper[4885]: I0308 21:04:16.175329 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5511-account-create-update-fvjhm" event={"ID":"8664af8f-0cf2-4ef8-a701-adbaba058240","Type":"ContainerDied","Data":"5a8b5b45c081a377860a6fc52da869749d2af03a3b4e62e944ef9b2a484b5105"} Mar 08 21:04:16 crc kubenswrapper[4885]: I0308 21:04:16.179393 4885 generic.go:334] "Generic (PLEG): container finished" podID="14dd4829-951f-4e19-885f-f466dcbf9d1b" containerID="38c31fe4cedc7f6d10ab5073880d121f8be2e24d735953c27d9d2bfdad42cb59" exitCode=0 Mar 08 21:04:16 crc kubenswrapper[4885]: I0308 21:04:16.179477 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-txw9w" event={"ID":"14dd4829-951f-4e19-885f-f466dcbf9d1b","Type":"ContainerDied","Data":"38c31fe4cedc7f6d10ab5073880d121f8be2e24d735953c27d9d2bfdad42cb59"} Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.582782 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-txw9w" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.636630 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts\") pod \"14dd4829-951f-4e19-885f-f466dcbf9d1b\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.636681 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzjsb\" (UniqueName: \"kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb\") pod \"14dd4829-951f-4e19-885f-f466dcbf9d1b\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.640497 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14dd4829-951f-4e19-885f-f466dcbf9d1b" (UID: "14dd4829-951f-4e19-885f-f466dcbf9d1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.664942 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb" (OuterVolumeSpecName: "kube-api-access-vzjsb") pod "14dd4829-951f-4e19-885f-f466dcbf9d1b" (UID: "14dd4829-951f-4e19-885f-f466dcbf9d1b"). InnerVolumeSpecName "kube-api-access-vzjsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.695637 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.740607 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g775d\" (UniqueName: \"kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d\") pod \"8664af8f-0cf2-4ef8-a701-adbaba058240\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.740936 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts\") pod \"8664af8f-0cf2-4ef8-a701-adbaba058240\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.741355 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.741372 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzjsb\" (UniqueName: \"kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.741868 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8664af8f-0cf2-4ef8-a701-adbaba058240" (UID: "8664af8f-0cf2-4ef8-a701-adbaba058240"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.743770 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d" (OuterVolumeSpecName: "kube-api-access-g775d") pod "8664af8f-0cf2-4ef8-a701-adbaba058240" (UID: "8664af8f-0cf2-4ef8-a701-adbaba058240"). InnerVolumeSpecName "kube-api-access-g775d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.843255 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g775d\" (UniqueName: \"kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.843285 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.214455 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-txw9w" event={"ID":"14dd4829-951f-4e19-885f-f466dcbf9d1b","Type":"ContainerDied","Data":"bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687"} Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.214530 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687" Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.215112 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-txw9w" Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.217861 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5511-account-create-update-fvjhm" event={"ID":"8664af8f-0cf2-4ef8-a701-adbaba058240","Type":"ContainerDied","Data":"3fa287746faf0fe91da972d4c41143470d41b6c0871ff490ed2e7825740674d4"} Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.217951 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fa287746faf0fe91da972d4c41143470d41b6c0871ff490ed2e7825740674d4" Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.217990 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.498816 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:04:19 crc kubenswrapper[4885]: E0308 21:04:19.499455 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8664af8f-0cf2-4ef8-a701-adbaba058240" containerName="mariadb-account-create-update" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.499468 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8664af8f-0cf2-4ef8-a701-adbaba058240" containerName="mariadb-account-create-update" Mar 08 21:04:19 crc kubenswrapper[4885]: E0308 21:04:19.499491 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dd4829-951f-4e19-885f-f466dcbf9d1b" containerName="mariadb-database-create" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.499497 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dd4829-951f-4e19-885f-f466dcbf9d1b" containerName="mariadb-database-create" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.499699 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="14dd4829-951f-4e19-885f-f466dcbf9d1b" 
containerName="mariadb-database-create" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.499721 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8664af8f-0cf2-4ef8-a701-adbaba058240" containerName="mariadb-account-create-update" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.500644 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.505126 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-q8g6n"] Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.506294 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.509696 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rbkt5" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.511283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.512556 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.526831 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-q8g6n"] Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.534426 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685059 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " 
pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685317 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685385 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qffxd\" (UniqueName: \"kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685470 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685541 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: 
\"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685570 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dkcc\" (UniqueName: \"kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685668 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685721 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786731 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " 
pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786804 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qffxd\" (UniqueName: \"kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786832 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786853 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786877 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786895 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 
21:04:19.787059 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dkcc\" (UniqueName: \"kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787102 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787121 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787149 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787866 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787882 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787939 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.788121 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.788449 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.792383 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.797459 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle\") pod \"placement-db-sync-q8g6n\" (UID: 
\"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.797626 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.810519 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qffxd\" (UniqueName: \"kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.818912 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dkcc\" (UniqueName: \"kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.824704 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.834804 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:20 crc kubenswrapper[4885]: I0308 21:04:20.308937 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-q8g6n"] Mar 08 21:04:20 crc kubenswrapper[4885]: W0308 21:04:20.392390 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda18bb9fd_f7a3_4935_9d89_26654d7e08c5.slice/crio-b86733ef92f0d474db415f2ffb90c760fd1a463fdc711ed1f41f539df4b11b99 WatchSource:0}: Error finding container b86733ef92f0d474db415f2ffb90c760fd1a463fdc711ed1f41f539df4b11b99: Status 404 returned error can't find the container with id b86733ef92f0d474db415f2ffb90c760fd1a463fdc711ed1f41f539df4b11b99 Mar 08 21:04:20 crc kubenswrapper[4885]: I0308 21:04:20.398977 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.248994 4885 generic.go:334] "Generic (PLEG): container finished" podID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerID="59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3" exitCode=0 Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.249257 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" event={"ID":"a18bb9fd-f7a3-4935-9d89-26654d7e08c5","Type":"ContainerDied","Data":"59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3"} Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.249469 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" event={"ID":"a18bb9fd-f7a3-4935-9d89-26654d7e08c5","Type":"ContainerStarted","Data":"b86733ef92f0d474db415f2ffb90c760fd1a463fdc711ed1f41f539df4b11b99"} Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.255301 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q8g6n" 
event={"ID":"1aad146d-597d-436f-ba72-59a57f223ad0","Type":"ContainerStarted","Data":"ce1aec0cb989ce899ff18178129c96b0d95ae41f48f578bbb62ca6f679d83d8f"} Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.255363 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q8g6n" event={"ID":"1aad146d-597d-436f-ba72-59a57f223ad0","Type":"ContainerStarted","Data":"e032ad73bf79a84291b02435a87a64f950ed21417a90d6c779f230809c445017"} Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.301504 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-q8g6n" podStartSLOduration=2.301483842 podStartE2EDuration="2.301483842s" podCreationTimestamp="2026-03-08 21:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:04:21.294078184 +0000 UTC m=+5562.690132217" watchObservedRunningTime="2026-03-08 21:04:21.301483842 +0000 UTC m=+5562.697537875" Mar 08 21:04:22 crc kubenswrapper[4885]: I0308 21:04:22.266661 4885 generic.go:334] "Generic (PLEG): container finished" podID="1aad146d-597d-436f-ba72-59a57f223ad0" containerID="ce1aec0cb989ce899ff18178129c96b0d95ae41f48f578bbb62ca6f679d83d8f" exitCode=0 Mar 08 21:04:22 crc kubenswrapper[4885]: I0308 21:04:22.266773 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q8g6n" event={"ID":"1aad146d-597d-436f-ba72-59a57f223ad0","Type":"ContainerDied","Data":"ce1aec0cb989ce899ff18178129c96b0d95ae41f48f578bbb62ca6f679d83d8f"} Mar 08 21:04:22 crc kubenswrapper[4885]: I0308 21:04:22.270025 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" event={"ID":"a18bb9fd-f7a3-4935-9d89-26654d7e08c5","Type":"ContainerStarted","Data":"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8"} Mar 08 21:04:22 crc kubenswrapper[4885]: I0308 21:04:22.270431 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2"
Mar 08 21:04:22 crc kubenswrapper[4885]: I0308 21:04:22.315774 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" podStartSLOduration=3.315747106 podStartE2EDuration="3.315747106s" podCreationTimestamp="2026-03-08 21:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:04:22.30843788 +0000 UTC m=+5563.704491943" watchObservedRunningTime="2026-03-08 21:04:22.315747106 +0000 UTC m=+5563.711801159"
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.697784 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q8g6n"
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.867798 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs\") pod \"1aad146d-597d-436f-ba72-59a57f223ad0\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") "
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.867863 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts\") pod \"1aad146d-597d-436f-ba72-59a57f223ad0\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") "
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.867933 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle\") pod \"1aad146d-597d-436f-ba72-59a57f223ad0\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") "
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.868004 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data\") pod \"1aad146d-597d-436f-ba72-59a57f223ad0\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") "
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.868035 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qffxd\" (UniqueName: \"kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd\") pod \"1aad146d-597d-436f-ba72-59a57f223ad0\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") "
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.870015 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs" (OuterVolumeSpecName: "logs") pod "1aad146d-597d-436f-ba72-59a57f223ad0" (UID: "1aad146d-597d-436f-ba72-59a57f223ad0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.876680 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd" (OuterVolumeSpecName: "kube-api-access-qffxd") pod "1aad146d-597d-436f-ba72-59a57f223ad0" (UID: "1aad146d-597d-436f-ba72-59a57f223ad0"). InnerVolumeSpecName "kube-api-access-qffxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.893049 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts" (OuterVolumeSpecName: "scripts") pod "1aad146d-597d-436f-ba72-59a57f223ad0" (UID: "1aad146d-597d-436f-ba72-59a57f223ad0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.900795 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data" (OuterVolumeSpecName: "config-data") pod "1aad146d-597d-436f-ba72-59a57f223ad0" (UID: "1aad146d-597d-436f-ba72-59a57f223ad0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.912068 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1aad146d-597d-436f-ba72-59a57f223ad0" (UID: "1aad146d-597d-436f-ba72-59a57f223ad0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.971391 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs\") on node \"crc\" DevicePath \"\""
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.971609 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.971734 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.971853 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.972059 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qffxd\" (UniqueName: \"kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd\") on node \"crc\" DevicePath \"\""
Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.295121 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q8g6n" event={"ID":"1aad146d-597d-436f-ba72-59a57f223ad0","Type":"ContainerDied","Data":"e032ad73bf79a84291b02435a87a64f950ed21417a90d6c779f230809c445017"}
Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.295185 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e032ad73bf79a84291b02435a87a64f950ed21417a90d6c779f230809c445017"
Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.295265 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q8g6n"
Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.903747 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5bc985567d-hcdbz"]
Mar 08 21:04:24 crc kubenswrapper[4885]: E0308 21:04:24.904332 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aad146d-597d-436f-ba72-59a57f223ad0" containerName="placement-db-sync"
Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.904355 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aad146d-597d-436f-ba72-59a57f223ad0" containerName="placement-db-sync"
Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.904648 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aad146d-597d-436f-ba72-59a57f223ad0" containerName="placement-db-sync"
Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.906186 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.909094 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.910418 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.912279 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rbkt5"
Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.919598 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bc985567d-hcdbz"]
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.093887 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rptk6\" (UniqueName: \"kubernetes.io/projected/0741bee5-7932-4af4-a8c1-1e56b754e359-kube-api-access-rptk6\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.094075 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-scripts\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.094253 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-combined-ca-bundle\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.094324 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0741bee5-7932-4af4-a8c1-1e56b754e359-logs\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.094519 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-config-data\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.196343 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-scripts\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.196444 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-combined-ca-bundle\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.196522 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0741bee5-7932-4af4-a8c1-1e56b754e359-logs\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.196574 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-config-data\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.196631 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rptk6\" (UniqueName: \"kubernetes.io/projected/0741bee5-7932-4af4-a8c1-1e56b754e359-kube-api-access-rptk6\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.197031 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0741bee5-7932-4af4-a8c1-1e56b754e359-logs\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.200842 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-scripts\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.202593 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-config-data\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.203153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-combined-ca-bundle\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.219176 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rptk6\" (UniqueName: \"kubernetes.io/projected/0741bee5-7932-4af4-a8c1-1e56b754e359-kube-api-access-rptk6\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.251470 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.716744 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bc985567d-hcdbz"]
Mar 08 21:04:25 crc kubenswrapper[4885]: W0308 21:04:25.718295 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0741bee5_7932_4af4_a8c1_1e56b754e359.slice/crio-eb57921c96eb72e57a87008b629ed4c389f542d5e6b70bb62d48c50a5b6a885d WatchSource:0}: Error finding container eb57921c96eb72e57a87008b629ed4c389f542d5e6b70bb62d48c50a5b6a885d: Status 404 returned error can't find the container with id eb57921c96eb72e57a87008b629ed4c389f542d5e6b70bb62d48c50a5b6a885d
Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.316081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc985567d-hcdbz" event={"ID":"0741bee5-7932-4af4-a8c1-1e56b754e359","Type":"ContainerStarted","Data":"42dac47964c4a24ba20aabbedfa225db530aeffae1a720e47c3e106ce647762f"}
Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.316158 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc985567d-hcdbz" event={"ID":"0741bee5-7932-4af4-a8c1-1e56b754e359","Type":"ContainerStarted","Data":"f4f9f28f407cb131f6b8101d3af1ec3565049a3a9de862df15a7783034d3e20a"}
Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.316185 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc985567d-hcdbz" event={"ID":"0741bee5-7932-4af4-a8c1-1e56b754e359","Type":"ContainerStarted","Data":"eb57921c96eb72e57a87008b629ed4c389f542d5e6b70bb62d48c50a5b6a885d"}
Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.316712 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.316796 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.361801 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5bc985567d-hcdbz" podStartSLOduration=2.361772726 podStartE2EDuration="2.361772726s" podCreationTimestamp="2026-03-08 21:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:04:26.341979248 +0000 UTC m=+5567.738033311" watchObservedRunningTime="2026-03-08 21:04:26.361772726 +0000 UTC m=+5567.757826779"
Mar 08 21:04:28 crc kubenswrapper[4885]: I0308 21:04:28.368192 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"
Mar 08 21:04:28 crc kubenswrapper[4885]: E0308 21:04:28.370734 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:04:29 crc kubenswrapper[4885]: I0308 21:04:29.826386 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2"
Mar 08 21:04:29 crc kubenswrapper[4885]: I0308 21:04:29.929518 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"]
Mar 08 21:04:29 crc kubenswrapper[4885]: I0308 21:04:29.929800 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54fd489df-k9th6" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="dnsmasq-dns" containerID="cri-o://7dd39daad27eae124834c42bda6676c4988f5e52cc85f87170d8845fcdc1c6e4" gracePeriod=10
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.359713 4885 generic.go:334] "Generic (PLEG): container finished" podID="a16f7c8b-b930-4591-a967-9db46c52391c" containerID="7dd39daad27eae124834c42bda6676c4988f5e52cc85f87170d8845fcdc1c6e4" exitCode=0
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.360023 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54fd489df-k9th6" event={"ID":"a16f7c8b-b930-4591-a967-9db46c52391c","Type":"ContainerDied","Data":"7dd39daad27eae124834c42bda6676c4988f5e52cc85f87170d8845fcdc1c6e4"}
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.360047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54fd489df-k9th6" event={"ID":"a16f7c8b-b930-4591-a967-9db46c52391c","Type":"ContainerDied","Data":"1a913a93c16bbf12b9927ddd796c54de956ba88ff05aa50f518989cc7d07d3d0"}
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.360058 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a913a93c16bbf12b9927ddd796c54de956ba88ff05aa50f518989cc7d07d3d0"
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.395406 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54fd489df-k9th6"
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.517556 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb\") pod \"a16f7c8b-b930-4591-a967-9db46c52391c\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") "
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.517635 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb\") pod \"a16f7c8b-b930-4591-a967-9db46c52391c\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") "
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.517742 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config\") pod \"a16f7c8b-b930-4591-a967-9db46c52391c\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") "
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.517825 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-276lt\" (UniqueName: \"kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt\") pod \"a16f7c8b-b930-4591-a967-9db46c52391c\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") "
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.517898 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc\") pod \"a16f7c8b-b930-4591-a967-9db46c52391c\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") "
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.565287 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt" (OuterVolumeSpecName: "kube-api-access-276lt") pod "a16f7c8b-b930-4591-a967-9db46c52391c" (UID: "a16f7c8b-b930-4591-a967-9db46c52391c"). InnerVolumeSpecName "kube-api-access-276lt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.592166 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a16f7c8b-b930-4591-a967-9db46c52391c" (UID: "a16f7c8b-b930-4591-a967-9db46c52391c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.621005 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.621038 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-276lt\" (UniqueName: \"kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt\") on node \"crc\" DevicePath \"\""
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.634517 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a16f7c8b-b930-4591-a967-9db46c52391c" (UID: "a16f7c8b-b930-4591-a967-9db46c52391c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.644423 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a16f7c8b-b930-4591-a967-9db46c52391c" (UID: "a16f7c8b-b930-4591-a967-9db46c52391c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.656430 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config" (OuterVolumeSpecName: "config") pod "a16f7c8b-b930-4591-a967-9db46c52391c" (UID: "a16f7c8b-b930-4591-a967-9db46c52391c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.722124 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.722155 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.722166 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config\") on node \"crc\" DevicePath \"\""
Mar 08 21:04:31 crc kubenswrapper[4885]: I0308 21:04:31.369637 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54fd489df-k9th6"
Mar 08 21:04:31 crc kubenswrapper[4885]: I0308 21:04:31.432687 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"]
Mar 08 21:04:31 crc kubenswrapper[4885]: I0308 21:04:31.444090 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"]
Mar 08 21:04:33 crc kubenswrapper[4885]: I0308 21:04:33.387370 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" path="/var/lib/kubelet/pods/a16f7c8b-b930-4591-a967-9db46c52391c/volumes"
Mar 08 21:04:43 crc kubenswrapper[4885]: I0308 21:04:43.369183 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"
Mar 08 21:04:43 crc kubenswrapper[4885]: E0308 21:04:43.372197 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:04:56 crc kubenswrapper[4885]: I0308 21:04:56.284504 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:56 crc kubenswrapper[4885]: I0308 21:04:56.298796 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bc985567d-hcdbz"
Mar 08 21:04:56 crc kubenswrapper[4885]: I0308 21:04:56.369026 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"
Mar 08 21:04:56 crc kubenswrapper[4885]: E0308 21:04:56.369513 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:05:05 crc kubenswrapper[4885]: I0308 21:05:05.142494 4885 scope.go:117] "RemoveContainer" containerID="7d907d9b611e54cf97509bd0480731a1868d207218e0f55feb360b3b591d95c2"
Mar 08 21:05:05 crc kubenswrapper[4885]: I0308 21:05:05.175657 4885 scope.go:117] "RemoveContainer" containerID="21b8174ae95621e7d89055b3e4716d5a83a8f7fb3dd103300c6b0dc26e415bb4"
Mar 08 21:05:11 crc kubenswrapper[4885]: I0308 21:05:11.369047 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"
Mar 08 21:05:11 crc kubenswrapper[4885]: E0308 21:05:11.370087 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.890766 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-qgblt"]
Mar 08 21:05:20 crc kubenswrapper[4885]: E0308 21:05:20.891611 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="dnsmasq-dns"
Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.891626 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="dnsmasq-dns"
Mar 08 21:05:20 crc kubenswrapper[4885]: E0308 21:05:20.891657 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="init"
Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.891665 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="init"
Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.891874 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="dnsmasq-dns"
Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.892503 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qgblt"
Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.899733 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qgblt"]
Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.964150 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt"
Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.964426 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqk66\" (UniqueName: \"kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt"
Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.972625 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-58ntc"]
Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.973641 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-58ntc"
Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.996537 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-58ntc"]
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.073770 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-cd23-account-create-update-5qvdh"]
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.074818 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cd23-account-create-update-5qvdh"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.075952 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.076010 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts\") pod \"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.076044 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqk66\" (UniqueName: \"kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.076151 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56hkd\" (UniqueName: \"kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd\") pod \"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.076658 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.076804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.084138 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nk6qt"]
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.085400 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nk6qt"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.093028 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cd23-account-create-update-5qvdh"]
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.103518 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nk6qt"]
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.105434 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqk66\" (UniqueName: \"kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178082 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178161 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hxp2\" (UniqueName: \"kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178246 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178276 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56hkd\" (UniqueName: \"kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd\") pod \"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178335 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts\") pod \"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwxt2\" (UniqueName: \"kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178971 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts\") pod \"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.193985 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56hkd\" (UniqueName: \"kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd\") pod \"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.245026 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qgblt"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.278474 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-eab3-account-create-update-8r4nd"]
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.279856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.281400 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.281490 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwxt2\" (UniqueName: \"kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.281547 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh"
Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.281606 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"kube-api-access-6hxp2\" (UniqueName: \"kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.282722 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.283502 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.283891 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.289684 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-eab3-account-create-update-8r4nd"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.293616 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.299202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwxt2\" (UniqueName: \"kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.319689 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hxp2\" (UniqueName: \"kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.384769 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdsk7\" (UniqueName: \"kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.384993 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.396245 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9f6d-account-create-update-jn2lc"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 
21:05:21.401883 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.417040 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9f6d-account-create-update-jn2lc"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.417289 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.421471 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.444754 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.486815 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.486914 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v89t\" (UniqueName: \"kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.486988 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.487034 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdsk7\" (UniqueName: \"kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.487504 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.508026 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdsk7\" (UniqueName: \"kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.589142 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v89t\" (UniqueName: \"kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 
21:05:21.589194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.589896 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.607022 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v89t\" (UniqueName: \"kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.713687 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qgblt"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.722953 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.748272 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.863597 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-58ntc"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.940766 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cd23-account-create-update-5qvdh"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.984346 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nk6qt"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.996510 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-eab3-account-create-update-8r4nd"] Mar 08 21:05:21 crc kubenswrapper[4885]: W0308 21:05:21.997403 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbafc9a8b_2cbe_465d_8055_e6c2675b80a4.slice/crio-bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642 WatchSource:0}: Error finding container bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642: Status 404 returned error can't find the container with id bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642 Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.047549 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nk6qt" event={"ID":"bafc9a8b-2cbe-465d-8055-e6c2675b80a4","Type":"ContainerStarted","Data":"bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642"} Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.060204 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cd23-account-create-update-5qvdh" event={"ID":"64ea00b6-97bd-459b-ad43-bbfc5862cc4c","Type":"ContainerStarted","Data":"66d955d9a60f35aac16847ce69e422cd51ab8223508959adf034f2e8417adab2"} Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 
21:05:22.061670 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" event={"ID":"b7b81f14-560e-4a64-88c7-164fbb0b4f8b","Type":"ContainerStarted","Data":"c6714401753337bb32708fcc881bfc6ecb5b029fe410a588e80ff9ca0fd71fc2"} Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.063052 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-58ntc" event={"ID":"869febc8-e7d9-4723-bc87-567e08849a27","Type":"ContainerStarted","Data":"a24514f8689fd17b2e912c8026a4da02c2996c25c6e1ecc2bda139d4ad6a64d4"} Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.064057 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qgblt" event={"ID":"9a6e4793-0be0-4d9f-b96a-c8877648415e","Type":"ContainerStarted","Data":"293fcdf5f1f3770069df599650a0ea581f09c4e28effba9f99eb6879ddb6a2a4"} Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.064075 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qgblt" event={"ID":"9a6e4793-0be0-4d9f-b96a-c8877648415e","Type":"ContainerStarted","Data":"7c6b8a131d6cbfb22fe18be558dad0ece897fdb49c10f467670f61287fccfca6"} Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.273995 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-qgblt" podStartSLOduration=2.2739789 podStartE2EDuration="2.2739789s" podCreationTimestamp="2026-03-08 21:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:22.080729246 +0000 UTC m=+5623.476783469" watchObservedRunningTime="2026-03-08 21:05:22.2739789 +0000 UTC m=+5623.670032923" Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.280085 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9f6d-account-create-update-jn2lc"] Mar 08 21:05:22 crc kubenswrapper[4885]: 
W0308 21:05:22.302812 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod636cf333_497f_4fcf_9d2d_ebfe48c81d75.slice/crio-10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3 WatchSource:0}: Error finding container 10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3: Status 404 returned error can't find the container with id 10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.071577 4885 generic.go:334] "Generic (PLEG): container finished" podID="bafc9a8b-2cbe-465d-8055-e6c2675b80a4" containerID="08cb7392dd836d2cf5e583b01bad8a88a737b02245c6ec9a4a8e07b52e00a8cf" exitCode=0 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.071624 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nk6qt" event={"ID":"bafc9a8b-2cbe-465d-8055-e6c2675b80a4","Type":"ContainerDied","Data":"08cb7392dd836d2cf5e583b01bad8a88a737b02245c6ec9a4a8e07b52e00a8cf"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.073676 4885 generic.go:334] "Generic (PLEG): container finished" podID="64ea00b6-97bd-459b-ad43-bbfc5862cc4c" containerID="3f3b93600a59d7fdfedddb2e79ea7fb7eee2ed381b6d60e917ab50e93241509a" exitCode=0 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.073749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cd23-account-create-update-5qvdh" event={"ID":"64ea00b6-97bd-459b-ad43-bbfc5862cc4c","Type":"ContainerDied","Data":"3f3b93600a59d7fdfedddb2e79ea7fb7eee2ed381b6d60e917ab50e93241509a"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.075351 4885 generic.go:334] "Generic (PLEG): container finished" podID="b7b81f14-560e-4a64-88c7-164fbb0b4f8b" containerID="2a5e8c0d61eedd0069d39190cdfa7686da395e0e45e1d4b7133ef0d8e637e513" exitCode=0 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.075397 4885 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" event={"ID":"b7b81f14-560e-4a64-88c7-164fbb0b4f8b","Type":"ContainerDied","Data":"2a5e8c0d61eedd0069d39190cdfa7686da395e0e45e1d4b7133ef0d8e637e513"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.076664 4885 generic.go:334] "Generic (PLEG): container finished" podID="636cf333-497f-4fcf-9d2d-ebfe48c81d75" containerID="085db1d51848063091ed8cc366e74589bc9b1a67399db7aae932f752c5c7bcca" exitCode=0 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.076696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" event={"ID":"636cf333-497f-4fcf-9d2d-ebfe48c81d75","Type":"ContainerDied","Data":"085db1d51848063091ed8cc366e74589bc9b1a67399db7aae932f752c5c7bcca"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.076725 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" event={"ID":"636cf333-497f-4fcf-9d2d-ebfe48c81d75","Type":"ContainerStarted","Data":"10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.077885 4885 generic.go:334] "Generic (PLEG): container finished" podID="869febc8-e7d9-4723-bc87-567e08849a27" containerID="bdd2c701bb858773f060623b06a914478bf58cb8470912a63df694c3493b2a12" exitCode=0 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.077927 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-58ntc" event={"ID":"869febc8-e7d9-4723-bc87-567e08849a27","Type":"ContainerDied","Data":"bdd2c701bb858773f060623b06a914478bf58cb8470912a63df694c3493b2a12"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.079187 4885 generic.go:334] "Generic (PLEG): container finished" podID="9a6e4793-0be0-4d9f-b96a-c8877648415e" containerID="293fcdf5f1f3770069df599650a0ea581f09c4e28effba9f99eb6879ddb6a2a4" exitCode=0 Mar 08 21:05:23 crc 
kubenswrapper[4885]: I0308 21:05:23.079224 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qgblt" event={"ID":"9a6e4793-0be0-4d9f-b96a-c8877648415e","Type":"ContainerDied","Data":"293fcdf5f1f3770069df599650a0ea581f09c4e28effba9f99eb6879ddb6a2a4"} Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.517064 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.643103 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.643529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts\") pod \"9a6e4793-0be0-4d9f-b96a-c8877648415e\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.643681 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqk66\" (UniqueName: \"kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66\") pod \"9a6e4793-0be0-4d9f-b96a-c8877648415e\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.644412 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a6e4793-0be0-4d9f-b96a-c8877648415e" (UID: "9a6e4793-0be0-4d9f-b96a-c8877648415e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.649389 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66" (OuterVolumeSpecName: "kube-api-access-kqk66") pod "9a6e4793-0be0-4d9f-b96a-c8877648415e" (UID: "9a6e4793-0be0-4d9f-b96a-c8877648415e"). InnerVolumeSpecName "kube-api-access-kqk66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.649949 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.690854 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.698532 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.703881 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.745370 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts\") pod \"869febc8-e7d9-4723-bc87-567e08849a27\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.745463 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56hkd\" (UniqueName: \"kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd\") pod \"869febc8-e7d9-4723-bc87-567e08849a27\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.747182 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "869febc8-e7d9-4723-bc87-567e08849a27" (UID: "869febc8-e7d9-4723-bc87-567e08849a27"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.748164 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.748196 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.748211 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqk66\" (UniqueName: \"kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.754082 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd" (OuterVolumeSpecName: "kube-api-access-56hkd") pod "869febc8-e7d9-4723-bc87-567e08849a27" (UID: "869febc8-e7d9-4723-bc87-567e08849a27"). InnerVolumeSpecName "kube-api-access-56hkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.848977 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwxt2\" (UniqueName: \"kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2\") pod \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849095 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v89t\" (UniqueName: \"kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t\") pod \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849124 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts\") pod \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849163 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts\") pod \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849205 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts\") pod \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849238 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-wdsk7\" (UniqueName: \"kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7\") pod \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849292 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts\") pod \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849368 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hxp2\" (UniqueName: \"kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2\") pod \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849748 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56hkd\" (UniqueName: \"kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849835 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7b81f14-560e-4a64-88c7-164fbb0b4f8b" (UID: "b7b81f14-560e-4a64-88c7-164fbb0b4f8b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.850082 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bafc9a8b-2cbe-465d-8055-e6c2675b80a4" (UID: "bafc9a8b-2cbe-465d-8055-e6c2675b80a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.850498 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64ea00b6-97bd-459b-ad43-bbfc5862cc4c" (UID: "64ea00b6-97bd-459b-ad43-bbfc5862cc4c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.850858 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "636cf333-497f-4fcf-9d2d-ebfe48c81d75" (UID: "636cf333-497f-4fcf-9d2d-ebfe48c81d75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.852742 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t" (OuterVolumeSpecName: "kube-api-access-7v89t") pod "636cf333-497f-4fcf-9d2d-ebfe48c81d75" (UID: "636cf333-497f-4fcf-9d2d-ebfe48c81d75"). InnerVolumeSpecName "kube-api-access-7v89t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.852897 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2" (OuterVolumeSpecName: "kube-api-access-6hxp2") pod "64ea00b6-97bd-459b-ad43-bbfc5862cc4c" (UID: "64ea00b6-97bd-459b-ad43-bbfc5862cc4c"). InnerVolumeSpecName "kube-api-access-6hxp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.853238 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2" (OuterVolumeSpecName: "kube-api-access-bwxt2") pod "bafc9a8b-2cbe-465d-8055-e6c2675b80a4" (UID: "bafc9a8b-2cbe-465d-8055-e6c2675b80a4"). InnerVolumeSpecName "kube-api-access-bwxt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.854232 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7" (OuterVolumeSpecName: "kube-api-access-wdsk7") pod "b7b81f14-560e-4a64-88c7-164fbb0b4f8b" (UID: "b7b81f14-560e-4a64-88c7-164fbb0b4f8b"). InnerVolumeSpecName "kube-api-access-wdsk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.951914 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.951972 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.951986 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdsk7\" (UniqueName: \"kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.952000 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.952012 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hxp2\" (UniqueName: \"kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.952023 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwxt2\" (UniqueName: \"kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.952035 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v89t\" (UniqueName: \"kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t\") on node \"crc\" DevicePath \"\"" 
Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.952047 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.103007 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cd23-account-create-update-5qvdh" event={"ID":"64ea00b6-97bd-459b-ad43-bbfc5862cc4c","Type":"ContainerDied","Data":"66d955d9a60f35aac16847ce69e422cd51ab8223508959adf034f2e8417adab2"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.103053 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66d955d9a60f35aac16847ce69e422cd51ab8223508959adf034f2e8417adab2" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.103072 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.115935 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" event={"ID":"b7b81f14-560e-4a64-88c7-164fbb0b4f8b","Type":"ContainerDied","Data":"c6714401753337bb32708fcc881bfc6ecb5b029fe410a588e80ff9ca0fd71fc2"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.115976 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.115980 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6714401753337bb32708fcc881bfc6ecb5b029fe410a588e80ff9ca0fd71fc2" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.121308 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" event={"ID":"636cf333-497f-4fcf-9d2d-ebfe48c81d75","Type":"ContainerDied","Data":"10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.121370 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.121459 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.124349 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-58ntc" event={"ID":"869febc8-e7d9-4723-bc87-567e08849a27","Type":"ContainerDied","Data":"a24514f8689fd17b2e912c8026a4da02c2996c25c6e1ecc2bda139d4ad6a64d4"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.124414 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24514f8689fd17b2e912c8026a4da02c2996c25c6e1ecc2bda139d4ad6a64d4" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.124508 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.127554 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.128223 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qgblt" event={"ID":"9a6e4793-0be0-4d9f-b96a-c8877648415e","Type":"ContainerDied","Data":"7c6b8a131d6cbfb22fe18be558dad0ece897fdb49c10f467670f61287fccfca6"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.128287 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c6b8a131d6cbfb22fe18be558dad0ece897fdb49c10f467670f61287fccfca6" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.132550 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nk6qt" event={"ID":"bafc9a8b-2cbe-465d-8055-e6c2675b80a4","Type":"ContainerDied","Data":"bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.132607 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.132688 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.369313 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:05:25 crc kubenswrapper[4885]: E0308 21:05:25.369903 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.674600 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7pj4r"] Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675197 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6e4793-0be0-4d9f-b96a-c8877648415e" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675209 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6e4793-0be0-4d9f-b96a-c8877648415e" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675220 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636cf333-497f-4fcf-9d2d-ebfe48c81d75" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675226 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="636cf333-497f-4fcf-9d2d-ebfe48c81d75" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675238 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869febc8-e7d9-4723-bc87-567e08849a27" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: 
I0308 21:05:26.675245 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="869febc8-e7d9-4723-bc87-567e08849a27" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675253 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafc9a8b-2cbe-465d-8055-e6c2675b80a4" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675259 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafc9a8b-2cbe-465d-8055-e6c2675b80a4" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675266 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b81f14-560e-4a64-88c7-164fbb0b4f8b" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675271 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b81f14-560e-4a64-88c7-164fbb0b4f8b" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675286 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ea00b6-97bd-459b-ad43-bbfc5862cc4c" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675292 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ea00b6-97bd-459b-ad43-bbfc5862cc4c" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675452 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="636cf333-497f-4fcf-9d2d-ebfe48c81d75" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675463 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="869febc8-e7d9-4723-bc87-567e08849a27" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675472 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="64ea00b6-97bd-459b-ad43-bbfc5862cc4c" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675486 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafc9a8b-2cbe-465d-8055-e6c2675b80a4" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675495 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b81f14-560e-4a64-88c7-164fbb0b4f8b" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675503 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a6e4793-0be0-4d9f-b96a-c8877648415e" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.676528 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.693706 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7pj4r"] Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.700282 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.700504 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-h5r8g" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.700656 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.788948 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2n4w\" (UniqueName: \"kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " 
pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.789015 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.789320 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.789389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.890583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.890675 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2n4w\" (UniqueName: \"kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: 
\"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.890716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.890799 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.899583 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.899619 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.900280 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " 
pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.908994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2n4w\" (UniqueName: \"kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:27 crc kubenswrapper[4885]: I0308 21:05:27.013722 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:27 crc kubenswrapper[4885]: I0308 21:05:27.468571 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7pj4r"] Mar 08 21:05:28 crc kubenswrapper[4885]: I0308 21:05:28.190059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" event={"ID":"b06fac1b-774d-4b4d-afd9-58024d9e5903","Type":"ContainerStarted","Data":"8602697feac478750bd9bf6e693b70c9e3f1df0afea0deb7c2804af9bf248c24"} Mar 08 21:05:28 crc kubenswrapper[4885]: I0308 21:05:28.190121 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" event={"ID":"b06fac1b-774d-4b4d-afd9-58024d9e5903","Type":"ContainerStarted","Data":"2167c407393fa34506566b4f13f35923bd52439b3c613576ce23830c24be4dc9"} Mar 08 21:05:33 crc kubenswrapper[4885]: I0308 21:05:33.246069 4885 generic.go:334] "Generic (PLEG): container finished" podID="b06fac1b-774d-4b4d-afd9-58024d9e5903" containerID="8602697feac478750bd9bf6e693b70c9e3f1df0afea0deb7c2804af9bf248c24" exitCode=0 Mar 08 21:05:33 crc kubenswrapper[4885]: I0308 21:05:33.246216 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" 
event={"ID":"b06fac1b-774d-4b4d-afd9-58024d9e5903","Type":"ContainerDied","Data":"8602697feac478750bd9bf6e693b70c9e3f1df0afea0deb7c2804af9bf248c24"} Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.661341 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.846184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts\") pod \"b06fac1b-774d-4b4d-afd9-58024d9e5903\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.846317 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2n4w\" (UniqueName: \"kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w\") pod \"b06fac1b-774d-4b4d-afd9-58024d9e5903\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.846340 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data\") pod \"b06fac1b-774d-4b4d-afd9-58024d9e5903\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.846406 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle\") pod \"b06fac1b-774d-4b4d-afd9-58024d9e5903\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.852098 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts" (OuterVolumeSpecName: "scripts") pod 
"b06fac1b-774d-4b4d-afd9-58024d9e5903" (UID: "b06fac1b-774d-4b4d-afd9-58024d9e5903"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.857331 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w" (OuterVolumeSpecName: "kube-api-access-f2n4w") pod "b06fac1b-774d-4b4d-afd9-58024d9e5903" (UID: "b06fac1b-774d-4b4d-afd9-58024d9e5903"). InnerVolumeSpecName "kube-api-access-f2n4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.873442 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b06fac1b-774d-4b4d-afd9-58024d9e5903" (UID: "b06fac1b-774d-4b4d-afd9-58024d9e5903"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.881261 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data" (OuterVolumeSpecName: "config-data") pod "b06fac1b-774d-4b4d-afd9-58024d9e5903" (UID: "b06fac1b-774d-4b4d-afd9-58024d9e5903"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.948870 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.948912 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2n4w\" (UniqueName: \"kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.948970 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.948993 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.271877 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" event={"ID":"b06fac1b-774d-4b4d-afd9-58024d9e5903","Type":"ContainerDied","Data":"2167c407393fa34506566b4f13f35923bd52439b3c613576ce23830c24be4dc9"} Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.271969 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2167c407393fa34506566b4f13f35923bd52439b3c613576ce23830c24be4dc9" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.272016 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.400315 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:05:35 crc kubenswrapper[4885]: E0308 21:05:35.400901 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b06fac1b-774d-4b4d-afd9-58024d9e5903" containerName="nova-cell0-conductor-db-sync" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.400957 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06fac1b-774d-4b4d-afd9-58024d9e5903" containerName="nova-cell0-conductor-db-sync" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.401276 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b06fac1b-774d-4b4d-afd9-58024d9e5903" containerName="nova-cell0-conductor-db-sync" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.402238 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.415476 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.446092 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-h5r8g" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.446972 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.566511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 
21:05:35.566681 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bp6b\" (UniqueName: \"kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.566871 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.672485 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.673140 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bp6b\" (UniqueName: \"kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.673222 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.678669 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.679521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.698007 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bp6b\" (UniqueName: \"kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.764997 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:36 crc kubenswrapper[4885]: I0308 21:05:36.020204 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:05:36 crc kubenswrapper[4885]: W0308 21:05:36.030215 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dff0b58_ac0f_4d39_9910_f924fff8f816.slice/crio-31d1be971bb445599e9b2b87dbc985813f03ee86921b07147d64305072f4cfbe WatchSource:0}: Error finding container 31d1be971bb445599e9b2b87dbc985813f03ee86921b07147d64305072f4cfbe: Status 404 returned error can't find the container with id 31d1be971bb445599e9b2b87dbc985813f03ee86921b07147d64305072f4cfbe Mar 08 21:05:36 crc kubenswrapper[4885]: I0308 21:05:36.285969 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1dff0b58-ac0f-4d39-9910-f924fff8f816","Type":"ContainerStarted","Data":"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647"} Mar 08 21:05:36 crc kubenswrapper[4885]: I0308 21:05:36.286037 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1dff0b58-ac0f-4d39-9910-f924fff8f816","Type":"ContainerStarted","Data":"31d1be971bb445599e9b2b87dbc985813f03ee86921b07147d64305072f4cfbe"} Mar 08 21:05:36 crc kubenswrapper[4885]: I0308 21:05:36.286461 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:39 crc kubenswrapper[4885]: I0308 21:05:39.377719 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:05:39 crc kubenswrapper[4885]: E0308 21:05:39.379484 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:05:45 crc kubenswrapper[4885]: I0308 21:05:45.809371 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:45 crc kubenswrapper[4885]: I0308 21:05:45.849898 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=10.849867266 podStartE2EDuration="10.849867266s" podCreationTimestamp="2026-03-08 21:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:36.312298208 +0000 UTC m=+5637.708352231" watchObservedRunningTime="2026-03-08 21:05:45.849867266 +0000 UTC m=+5647.245921329" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.302912 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vk668"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.306409 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.309548 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.309580 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.324184 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.324227 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.324281 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.324310 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8xxh\" (UniqueName: \"kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") 
" pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.330418 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vk668"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.424107 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.425453 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428320 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428387 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n8vd\" (UniqueName: \"kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428421 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428447 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts\") pod \"nova-cell0-cell-mapping-vk668\" (UID: 
\"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428509 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428531 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xxh\" (UniqueName: \"kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428667 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.433832 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.434907 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.436577 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 
21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.437317 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.442852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.449622 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8xxh\" (UniqueName: \"kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.525424 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529447 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529528 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data\") pod \"nova-metadata-0\" (UID: 
\"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529566 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529601 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n8vd\" (UniqueName: \"kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529669 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwr5z\" (UniqueName: \"kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529743 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 
21:05:46.536777 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.542349 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.543272 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.550592 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.560491 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n8vd\" (UniqueName: \"kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.567427 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.630717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.630768 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.630835 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwr5z\" (UniqueName: \"kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.630909 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.631382 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.634737 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.649820 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " 
pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.659317 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.675484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwr5z\" (UniqueName: \"kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.680111 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.681463 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.733123 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.803151 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.804619 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.817749 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.818750 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.826393 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.827358 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835515 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h92b5\" (UniqueName: \"kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835565 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835596 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835623 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835655 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835675 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grkm9\" (UniqueName: \"kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835711 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dsgd\" (UniqueName: \"kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835733 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835781 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835799 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.843560 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.844434 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.879960 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.920362 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949594 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949649 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949691 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949709 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949738 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grkm9\" (UniqueName: \"kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949780 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dsgd\" (UniqueName: \"kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949801 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949826 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949911 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h92b5\" (UniqueName: \"kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.951029 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.951884 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.952202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.953034 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " 
pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.963532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.966853 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.967483 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grkm9\" (UniqueName: \"kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.968381 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dsgd\" (UniqueName: \"kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.968960 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h92b5\" (UniqueName: \"kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.969099 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.969533 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.970092 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.035006 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.172096 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.214058 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.214719 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.340235 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vk668"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.396439 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.478147 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vk668" event={"ID":"a9535a5b-072e-4a1f-b9e4-89942ba9e800","Type":"ContainerStarted","Data":"1a67b003088892071c10f5b2efba45b5a66154636095afd9a5375dbb09036035"} Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.479401 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" event={"ID":"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b","Type":"ContainerStarted","Data":"4c375bccd82478c29984cb004edc82a922cd3f66f47be6d3e4038a2ff4cf6623"} Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.480270 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e08bdfc6-0196-4b23-b6a5-b0c947f646e7","Type":"ContainerStarted","Data":"5cf04ad54507a736672fee7faf37e9b754d2e6cb6f2076a1a5654158ee25a581"} Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.512859 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.598455 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ps8dx"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.599854 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.603780 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.603828 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.613567 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ps8dx"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.664977 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kqct\" (UniqueName: \"kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.665289 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.665343 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.665399 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.751607 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.766556 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.766669 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.766780 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kqct\" (UniqueName: \"kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.766815 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " 
pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.773791 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.783452 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.783849 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.784454 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kqct\" (UniqueName: \"kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: W0308 21:05:47.855715 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb854cb23_a7e2_4249_9d13_70599979ab86.slice/crio-ee96183d981f50b41a804b3c4254441ccd6031f6d37fbd5f621cb8a54a1629aa WatchSource:0}: Error finding container 
ee96183d981f50b41a804b3c4254441ccd6031f6d37fbd5f621cb8a54a1629aa: Status 404 returned error can't find the container with id ee96183d981f50b41a804b3c4254441ccd6031f6d37fbd5f621cb8a54a1629aa Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.875235 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.928243 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.412520 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ps8dx"] Mar 08 21:05:48 crc kubenswrapper[4885]: W0308 21:05:48.417716 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeed9605f_3b77_4800_9534_6d8f2654f392.slice/crio-81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893 WatchSource:0}: Error finding container 81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893: Status 404 returned error can't find the container with id 81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893 Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.502205 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerStarted","Data":"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.502554 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerStarted","Data":"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.502564 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerStarted","Data":"3c2a0cba92f705d00f5c037b2bfdeb75772f15dbb417883adb8221811ddee0ac"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.504739 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vk668" event={"ID":"a9535a5b-072e-4a1f-b9e4-89942ba9e800","Type":"ContainerStarted","Data":"57d34301e8cc7f8e0b2d448fe0ecba13af188f594f144adc104fb3b5dabb2f60"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.514825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7213b9f1-1c28-4e32-b68b-8f7464f38de0","Type":"ContainerStarted","Data":"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.514863 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7213b9f1-1c28-4e32-b68b-8f7464f38de0","Type":"ContainerStarted","Data":"6877cb37b34c366b3176b008053204c71bb5733fa92d9537850aea7c85b6ca99"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.530301 4885 generic.go:334] "Generic (PLEG): container finished" podID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerID="364b7c9836de2ac4dcd0074d16339cb1e1fe0eee56d6ea6aba2ce5bd28ef8b4b" exitCode=0 Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.530377 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" event={"ID":"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b","Type":"ContainerDied","Data":"364b7c9836de2ac4dcd0074d16339cb1e1fe0eee56d6ea6aba2ce5bd28ef8b4b"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.537336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" 
event={"ID":"eed9605f-3b77-4800-9534-6d8f2654f392","Type":"ContainerStarted","Data":"81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.539554 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e08bdfc6-0196-4b23-b6a5-b0c947f646e7","Type":"ContainerStarted","Data":"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.544360 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.544343776 podStartE2EDuration="2.544343776s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:48.540667157 +0000 UTC m=+5649.936721190" watchObservedRunningTime="2026-03-08 21:05:48.544343776 +0000 UTC m=+5649.940397799" Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.561469 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerStarted","Data":"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.561511 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerStarted","Data":"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.561520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerStarted","Data":"ee96183d981f50b41a804b3c4254441ccd6031f6d37fbd5f621cb8a54a1629aa"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.564400 4885 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.56438232 podStartE2EDuration="2.56438232s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:48.562197862 +0000 UTC m=+5649.958251885" watchObservedRunningTime="2026-03-08 21:05:48.56438232 +0000 UTC m=+5649.960436343" Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.672972 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vk668" podStartSLOduration=2.672947046 podStartE2EDuration="2.672947046s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:48.592278564 +0000 UTC m=+5649.988332587" watchObservedRunningTime="2026-03-08 21:05:48.672947046 +0000 UTC m=+5650.069001069" Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.685230 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.685215624 podStartE2EDuration="2.685215624s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:48.633383841 +0000 UTC m=+5650.029437864" watchObservedRunningTime="2026-03-08 21:05:48.685215624 +0000 UTC m=+5650.081269637" Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.703577 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.703556973 podStartE2EDuration="2.703556973s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-08 21:05:48.66446765 +0000 UTC m=+5650.060521673" watchObservedRunningTime="2026-03-08 21:05:48.703556973 +0000 UTC m=+5650.099610996" Mar 08 21:05:49 crc kubenswrapper[4885]: I0308 21:05:49.590132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" event={"ID":"eed9605f-3b77-4800-9534-6d8f2654f392","Type":"ContainerStarted","Data":"9053532705caa4a801f382164c347679058c8a5255c223b315fac67e8c18e8ef"} Mar 08 21:05:49 crc kubenswrapper[4885]: I0308 21:05:49.618800 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" event={"ID":"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b","Type":"ContainerStarted","Data":"53391697a2f4f90c5ca24bdf716499111736339ebf0bad4e24ae2562e126d17f"} Mar 08 21:05:49 crc kubenswrapper[4885]: I0308 21:05:49.626962 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" podStartSLOduration=2.626949272 podStartE2EDuration="2.626949272s" podCreationTimestamp="2026-03-08 21:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:49.617708076 +0000 UTC m=+5651.013762109" watchObservedRunningTime="2026-03-08 21:05:49.626949272 +0000 UTC m=+5651.023003295" Mar 08 21:05:49 crc kubenswrapper[4885]: I0308 21:05:49.665386 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" podStartSLOduration=3.665369237 podStartE2EDuration="3.665369237s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:49.657342483 +0000 UTC m=+5651.053396506" watchObservedRunningTime="2026-03-08 21:05:49.665369237 +0000 UTC m=+5651.061423260" Mar 08 21:05:50 crc kubenswrapper[4885]: 
I0308 21:05:50.628369 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:51 crc kubenswrapper[4885]: I0308 21:05:51.639417 4885 generic.go:334] "Generic (PLEG): container finished" podID="eed9605f-3b77-4800-9534-6d8f2654f392" containerID="9053532705caa4a801f382164c347679058c8a5255c223b315fac67e8c18e8ef" exitCode=0 Mar 08 21:05:51 crc kubenswrapper[4885]: I0308 21:05:51.640363 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" event={"ID":"eed9605f-3b77-4800-9534-6d8f2654f392","Type":"ContainerDied","Data":"9053532705caa4a801f382164c347679058c8a5255c223b315fac67e8c18e8ef"} Mar 08 21:05:51 crc kubenswrapper[4885]: I0308 21:05:51.827440 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 21:05:51 crc kubenswrapper[4885]: I0308 21:05:51.921269 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:05:51 crc kubenswrapper[4885]: I0308 21:05:51.921356 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:05:52 crc kubenswrapper[4885]: I0308 21:05:52.215369 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:52 crc kubenswrapper[4885]: I0308 21:05:52.653439 4885 generic.go:334] "Generic (PLEG): container finished" podID="a9535a5b-072e-4a1f-b9e4-89942ba9e800" containerID="57d34301e8cc7f8e0b2d448fe0ecba13af188f594f144adc104fb3b5dabb2f60" exitCode=0 Mar 08 21:05:52 crc kubenswrapper[4885]: I0308 21:05:52.653555 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vk668" event={"ID":"a9535a5b-072e-4a1f-b9e4-89942ba9e800","Type":"ContainerDied","Data":"57d34301e8cc7f8e0b2d448fe0ecba13af188f594f144adc104fb3b5dabb2f60"} Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 
21:05:53.151076 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.292327 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data\") pod \"eed9605f-3b77-4800-9534-6d8f2654f392\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.292600 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kqct\" (UniqueName: \"kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct\") pod \"eed9605f-3b77-4800-9534-6d8f2654f392\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.292801 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts\") pod \"eed9605f-3b77-4800-9534-6d8f2654f392\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.292911 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle\") pod \"eed9605f-3b77-4800-9534-6d8f2654f392\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.306028 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts" (OuterVolumeSpecName: "scripts") pod "eed9605f-3b77-4800-9534-6d8f2654f392" (UID: "eed9605f-3b77-4800-9534-6d8f2654f392"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.306176 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct" (OuterVolumeSpecName: "kube-api-access-2kqct") pod "eed9605f-3b77-4800-9534-6d8f2654f392" (UID: "eed9605f-3b77-4800-9534-6d8f2654f392"). InnerVolumeSpecName "kube-api-access-2kqct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.318824 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data" (OuterVolumeSpecName: "config-data") pod "eed9605f-3b77-4800-9534-6d8f2654f392" (UID: "eed9605f-3b77-4800-9534-6d8f2654f392"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.347127 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eed9605f-3b77-4800-9534-6d8f2654f392" (UID: "eed9605f-3b77-4800-9534-6d8f2654f392"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.395945 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kqct\" (UniqueName: \"kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.395992 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.396013 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.396033 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.672784 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.672837 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" event={"ID":"eed9605f-3b77-4800-9534-6d8f2654f392","Type":"ContainerDied","Data":"81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893"} Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.672864 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.805575 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:05:53 crc kubenswrapper[4885]: E0308 21:05:53.806034 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed9605f-3b77-4800-9534-6d8f2654f392" containerName="nova-cell1-conductor-db-sync" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.806055 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed9605f-3b77-4800-9534-6d8f2654f392" containerName="nova-cell1-conductor-db-sync" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.806345 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eed9605f-3b77-4800-9534-6d8f2654f392" containerName="nova-cell1-conductor-db-sync" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.807110 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.809378 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7jsn\" (UniqueName: \"kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.809423 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.809469 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.810058 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.829104 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.910862 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7jsn\" (UniqueName: \"kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 
21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.910904 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.910938 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.914600 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.917739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.943150 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7jsn\" (UniqueName: \"kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.022123 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.114845 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8xxh\" (UniqueName: \"kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh\") pod \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.115542 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data\") pod \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.115583 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts\") pod \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.115627 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle\") pod \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.119133 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh" (OuterVolumeSpecName: "kube-api-access-v8xxh") pod "a9535a5b-072e-4a1f-b9e4-89942ba9e800" (UID: "a9535a5b-072e-4a1f-b9e4-89942ba9e800"). InnerVolumeSpecName "kube-api-access-v8xxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.120422 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts" (OuterVolumeSpecName: "scripts") pod "a9535a5b-072e-4a1f-b9e4-89942ba9e800" (UID: "a9535a5b-072e-4a1f-b9e4-89942ba9e800"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.132906 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.141131 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9535a5b-072e-4a1f-b9e4-89942ba9e800" (UID: "a9535a5b-072e-4a1f-b9e4-89942ba9e800"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.157954 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data" (OuterVolumeSpecName: "config-data") pod "a9535a5b-072e-4a1f-b9e4-89942ba9e800" (UID: "a9535a5b-072e-4a1f-b9e4-89942ba9e800"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.217526 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.217846 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.218065 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.218196 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8xxh\" (UniqueName: \"kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.369096 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:05:54 crc kubenswrapper[4885]: E0308 21:05:54.369462 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.410021 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 
21:05:54.688834 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69024278-2c5f-4862-ac44-e04663a0c4a5","Type":"ContainerStarted","Data":"b24462edde9f60cfa7555c270a546d933c909d490fc62394b9ed6e4a826084f2"} Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.689579 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.689601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69024278-2c5f-4862-ac44-e04663a0c4a5","Type":"ContainerStarted","Data":"38f1ceae301c390ad8c08deda2ca09e48e527fb1b139f821cdcf653ea04147c0"} Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.697490 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vk668" event={"ID":"a9535a5b-072e-4a1f-b9e4-89942ba9e800","Type":"ContainerDied","Data":"1a67b003088892071c10f5b2efba45b5a66154636095afd9a5375dbb09036035"} Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.697556 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a67b003088892071c10f5b2efba45b5a66154636095afd9a5375dbb09036035" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.697605 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.715770 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.715748517 podStartE2EDuration="1.715748517s" podCreationTimestamp="2026-03-08 21:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:54.705464092 +0000 UTC m=+5656.101518125" watchObservedRunningTime="2026-03-08 21:05:54.715748517 +0000 UTC m=+5656.111802530" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.909454 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.909861 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" containerName="nova-scheduler-scheduler" containerID="cri-o://df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2" gracePeriod=30 Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.919810 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.920479 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-api" containerID="cri-o://62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" gracePeriod=30 Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.920749 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-log" containerID="cri-o://c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" gracePeriod=30 Mar 08 21:05:55 crc 
kubenswrapper[4885]: I0308 21:05:55.001462 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.001718 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-log" containerID="cri-o://7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" gracePeriod=30 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.002284 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-metadata" containerID="cri-o://e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" gracePeriod=30 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.668541 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.688226 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706715 4885 generic.go:334] "Generic (PLEG): container finished" podID="b854cb23-a7e2-4249-9d13-70599979ab86" containerID="62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" exitCode=0 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706747 4885 generic.go:334] "Generic (PLEG): container finished" podID="b854cb23-a7e2-4249-9d13-70599979ab86" containerID="c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" exitCode=143 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706757 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706799 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerDied","Data":"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerDied","Data":"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706864 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerDied","Data":"ee96183d981f50b41a804b3c4254441ccd6031f6d37fbd5f621cb8a54a1629aa"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706880 4885 scope.go:117] "RemoveContainer" containerID="62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709140 4885 generic.go:334] "Generic (PLEG): container finished" podID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerID="e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" exitCode=0 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709159 4885 generic.go:334] "Generic (PLEG): container finished" podID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerID="7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" exitCode=143 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709169 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerDied","Data":"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709204 4885 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709206 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerDied","Data":"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerDied","Data":"3c2a0cba92f705d00f5c037b2bfdeb75772f15dbb417883adb8221811ddee0ac"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.733868 4885 scope.go:117] "RemoveContainer" containerID="c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.755236 4885 scope.go:117] "RemoveContainer" containerID="62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" Mar 08 21:05:55 crc kubenswrapper[4885]: E0308 21:05:55.757372 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72\": container with ID starting with 62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72 not found: ID does not exist" containerID="62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.757419 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72"} err="failed to get container status \"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72\": rpc error: code = NotFound desc = could not find container \"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72\": 
container with ID starting with 62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72 not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.757448 4885 scope.go:117] "RemoveContainer" containerID="c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" Mar 08 21:05:55 crc kubenswrapper[4885]: E0308 21:05:55.758214 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de\": container with ID starting with c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de not found: ID does not exist" containerID="c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.758302 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de"} err="failed to get container status \"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de\": rpc error: code = NotFound desc = could not find container \"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de\": container with ID starting with c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.758336 4885 scope.go:117] "RemoveContainer" containerID="62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.758743 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72"} err="failed to get container status \"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72\": rpc error: code = NotFound desc = could not find container 
\"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72\": container with ID starting with 62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72 not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.758797 4885 scope.go:117] "RemoveContainer" containerID="c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759158 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs\") pod \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759214 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle\") pod \"b854cb23-a7e2-4249-9d13-70599979ab86\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759244 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data\") pod \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759270 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data\") pod \"b854cb23-a7e2-4249-9d13-70599979ab86\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759307 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h92b5\" (UniqueName: 
\"kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5\") pod \"b854cb23-a7e2-4249-9d13-70599979ab86\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759483 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle\") pod \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759506 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwr5z\" (UniqueName: \"kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z\") pod \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759543 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs\") pod \"b854cb23-a7e2-4249-9d13-70599979ab86\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.760198 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs" (OuterVolumeSpecName: "logs") pod "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" (UID: "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.760489 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs" (OuterVolumeSpecName: "logs") pod "b854cb23-a7e2-4249-9d13-70599979ab86" (UID: "b854cb23-a7e2-4249-9d13-70599979ab86"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.765072 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de"} err="failed to get container status \"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de\": rpc error: code = NotFound desc = could not find container \"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de\": container with ID starting with c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.765113 4885 scope.go:117] "RemoveContainer" containerID="e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.765283 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5" (OuterVolumeSpecName: "kube-api-access-h92b5") pod "b854cb23-a7e2-4249-9d13-70599979ab86" (UID: "b854cb23-a7e2-4249-9d13-70599979ab86"). InnerVolumeSpecName "kube-api-access-h92b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.767137 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z" (OuterVolumeSpecName: "kube-api-access-pwr5z") pod "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" (UID: "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a"). InnerVolumeSpecName "kube-api-access-pwr5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.784300 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b854cb23-a7e2-4249-9d13-70599979ab86" (UID: "b854cb23-a7e2-4249-9d13-70599979ab86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.784396 4885 scope.go:117] "RemoveContainer" containerID="7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.785263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data" (OuterVolumeSpecName: "config-data") pod "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" (UID: "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.790826 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data" (OuterVolumeSpecName: "config-data") pod "b854cb23-a7e2-4249-9d13-70599979ab86" (UID: "b854cb23-a7e2-4249-9d13-70599979ab86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.792299 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" (UID: "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.800874 4885 scope.go:117] "RemoveContainer" containerID="e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" Mar 08 21:05:55 crc kubenswrapper[4885]: E0308 21:05:55.801248 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8\": container with ID starting with e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8 not found: ID does not exist" containerID="e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801278 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8"} err="failed to get container status \"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8\": rpc error: code = NotFound desc = could not find container \"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8\": container with ID starting with e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8 not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801299 4885 scope.go:117] "RemoveContainer" containerID="7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" Mar 08 21:05:55 crc kubenswrapper[4885]: E0308 21:05:55.801603 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba\": container with ID starting with 7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba not found: ID does not exist" containerID="7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801631 
4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba"} err="failed to get container status \"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba\": rpc error: code = NotFound desc = could not find container \"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba\": container with ID starting with 7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801648 4885 scope.go:117] "RemoveContainer" containerID="e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801917 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8"} err="failed to get container status \"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8\": rpc error: code = NotFound desc = could not find container \"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8\": container with ID starting with e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8 not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801969 4885 scope.go:117] "RemoveContainer" containerID="7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.802257 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba"} err="failed to get container status \"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba\": rpc error: code = NotFound desc = could not find container \"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba\": container with ID starting with 
7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862057 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862078 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862086 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862094 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h92b5\" (UniqueName: \"kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862103 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862111 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwr5z\" (UniqueName: \"kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862119 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc 
kubenswrapper[4885]: I0308 21:05:55.862127 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.086854 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.100013 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.117819 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.123899 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.134864 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: E0308 21:05:56.135355 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-log" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135375 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-log" Mar 08 21:05:56 crc kubenswrapper[4885]: E0308 21:05:56.135394 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-api" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135403 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-api" Mar 08 21:05:56 crc kubenswrapper[4885]: E0308 21:05:56.135427 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-log" Mar 08 21:05:56 crc kubenswrapper[4885]: 
I0308 21:05:56.135435 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-log" Mar 08 21:05:56 crc kubenswrapper[4885]: E0308 21:05:56.135460 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9535a5b-072e-4a1f-b9e4-89942ba9e800" containerName="nova-manage" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135468 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9535a5b-072e-4a1f-b9e4-89942ba9e800" containerName="nova-manage" Mar 08 21:05:56 crc kubenswrapper[4885]: E0308 21:05:56.135480 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-metadata" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135488 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-metadata" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135704 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-log" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135719 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-metadata" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135730 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9535a5b-072e-4a1f-b9e4-89942ba9e800" containerName="nova-manage" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135741 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-api" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135765 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-log" Mar 08 21:05:56 crc 
kubenswrapper[4885]: I0308 21:05:56.136875 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.139800 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.158341 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.159955 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.162984 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.167729 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.167788 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.167835 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.167851 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.167981 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.168124 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfkcz\" (UniqueName: \"kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.168147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.168168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj25c\" (UniqueName: \"kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.178305 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 
21:05:56.206524 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269608 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269649 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269694 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269738 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkcz\" (UniqueName: \"kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269755 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269773 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sj25c\" (UniqueName: \"kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269803 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.270824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.271047 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.274064 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 
21:05:56.274978 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.274994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.275484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.300600 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfkcz\" (UniqueName: \"kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.301711 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj25c\" (UniqueName: \"kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.461852 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.495306 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: W0308 21:05:56.939936 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d41f915_4ddc_4f84_b402_67e3ff310d0e.slice/crio-20924542c83be387f0ae030282c946d65a2f7a96ddb827f2a18a8e4ab1e8eddb WatchSource:0}: Error finding container 20924542c83be387f0ae030282c946d65a2f7a96ddb827f2a18a8e4ab1e8eddb: Status 404 returned error can't find the container with id 20924542c83be387f0ae030282c946d65a2f7a96ddb827f2a18a8e4ab1e8eddb Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.941831 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.037223 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:57 crc kubenswrapper[4885]: W0308 21:05:57.038468 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8a3be88_3b2d_4cd0_8987_443d67351acb.slice/crio-8c02c0dd36a35b639dcbf4fdeb957cafcaf6477b4aefd990c59836fff31381b3 WatchSource:0}: Error finding container 8c02c0dd36a35b639dcbf4fdeb957cafcaf6477b4aefd990c59836fff31381b3: Status 404 returned error can't find the container with id 8c02c0dd36a35b639dcbf4fdeb957cafcaf6477b4aefd990c59836fff31381b3 Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.042244 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.103252 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 
21:05:57.103581 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="dnsmasq-dns" containerID="cri-o://4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8" gracePeriod=10 Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.215486 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.238281 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.382081 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" path="/var/lib/kubelet/pods/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a/volumes" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.382707 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" path="/var/lib/kubelet/pods/b854cb23-a7e2-4249-9d13-70599979ab86/volumes" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.527178 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.696064 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb\") pod \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.696133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc\") pod \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.696191 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dkcc\" (UniqueName: \"kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc\") pod \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.696241 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb\") pod \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.696274 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config\") pod \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.706852 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc" (OuterVolumeSpecName: "kube-api-access-7dkcc") pod "a18bb9fd-f7a3-4935-9d89-26654d7e08c5" (UID: "a18bb9fd-f7a3-4935-9d89-26654d7e08c5"). InnerVolumeSpecName "kube-api-access-7dkcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.741603 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a18bb9fd-f7a3-4935-9d89-26654d7e08c5" (UID: "a18bb9fd-f7a3-4935-9d89-26654d7e08c5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.743422 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a18bb9fd-f7a3-4935-9d89-26654d7e08c5" (UID: "a18bb9fd-f7a3-4935-9d89-26654d7e08c5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.743817 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config" (OuterVolumeSpecName: "config") pod "a18bb9fd-f7a3-4935-9d89-26654d7e08c5" (UID: "a18bb9fd-f7a3-4935-9d89-26654d7e08c5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.746021 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerStarted","Data":"cdeaf3122ca8dbc663268a6c75437ef3b2d630cf202295db223a82e3e219dba9"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.746093 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerStarted","Data":"1c9a277208821dbe45c51d736ad5778842b9c47f876f42497004e47858db7d7a"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.746107 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerStarted","Data":"20924542c83be387f0ae030282c946d65a2f7a96ddb827f2a18a8e4ab1e8eddb"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.748610 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerStarted","Data":"756c1aa575f487dbda9fdf021512017307ddd58458c8521dba14b321c91f8b89"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.748646 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerStarted","Data":"145acadb2aaec8dcd5165a895f8909b8c1b38463cef4ef90b51c7692b91fbfd5"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.748658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerStarted","Data":"8c02c0dd36a35b639dcbf4fdeb957cafcaf6477b4aefd990c59836fff31381b3"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759319 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerID="4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8" exitCode=0 Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759489 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a18bb9fd-f7a3-4935-9d89-26654d7e08c5" (UID: "a18bb9fd-f7a3-4935-9d89-26654d7e08c5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759499 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" event={"ID":"a18bb9fd-f7a3-4935-9d89-26654d7e08c5","Type":"ContainerDied","Data":"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759559 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" event={"ID":"a18bb9fd-f7a3-4935-9d89-26654d7e08c5","Type":"ContainerDied","Data":"b86733ef92f0d474db415f2ffb90c760fd1a463fdc711ed1f41f539df4b11b99"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759584 4885 scope.go:117] "RemoveContainer" containerID="4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759990 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.775957 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.777594 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.777572785 podStartE2EDuration="1.777572785s" podCreationTimestamp="2026-03-08 21:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:57.76537411 +0000 UTC m=+5659.161428153" watchObservedRunningTime="2026-03-08 21:05:57.777572785 +0000 UTC m=+5659.173626808" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.785444 4885 scope.go:117] "RemoveContainer" containerID="59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.790769 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.7907468560000002 podStartE2EDuration="1.790746856s" podCreationTimestamp="2026-03-08 21:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:57.786561455 +0000 UTC m=+5659.182615488" watchObservedRunningTime="2026-03-08 21:05:57.790746856 +0000 UTC m=+5659.186800879" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.798454 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.798490 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.798505 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dkcc\" (UniqueName: \"kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.798516 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.798527 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.848126 4885 scope.go:117] "RemoveContainer" containerID="4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8" Mar 08 21:05:57 crc kubenswrapper[4885]: E0308 21:05:57.848486 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8\": container with ID starting with 4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8 not found: ID does not exist" containerID="4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.848529 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8"} err="failed to get container status \"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8\": rpc error: code = NotFound desc = could not find container 
\"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8\": container with ID starting with 4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8 not found: ID does not exist" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.848547 4885 scope.go:117] "RemoveContainer" containerID="59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3" Mar 08 21:05:57 crc kubenswrapper[4885]: E0308 21:05:57.848904 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3\": container with ID starting with 59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3 not found: ID does not exist" containerID="59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.848940 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3"} err="failed to get container status \"59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3\": rpc error: code = NotFound desc = could not find container \"59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3\": container with ID starting with 59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3 not found: ID does not exist" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.864814 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.871281 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.209439 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 
21:05:59.383414 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" path="/var/lib/kubelet/pods/a18bb9fd-f7a3-4935-9d89-26654d7e08c5/volumes" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.493889 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.662461 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle\") pod \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.662565 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data\") pod \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.662664 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n8vd\" (UniqueName: \"kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd\") pod \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.674103 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd" (OuterVolumeSpecName: "kube-api-access-7n8vd") pod "e08bdfc6-0196-4b23-b6a5-b0c947f646e7" (UID: "e08bdfc6-0196-4b23-b6a5-b0c947f646e7"). InnerVolumeSpecName "kube-api-access-7n8vd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.701249 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data" (OuterVolumeSpecName: "config-data") pod "e08bdfc6-0196-4b23-b6a5-b0c947f646e7" (UID: "e08bdfc6-0196-4b23-b6a5-b0c947f646e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.726137 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e08bdfc6-0196-4b23-b6a5-b0c947f646e7" (UID: "e08bdfc6-0196-4b23-b6a5-b0c947f646e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.764591 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.765363 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.765434 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n8vd\" (UniqueName: \"kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.781983 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-8crmt"] Mar 08 21:05:59 crc kubenswrapper[4885]: E0308 21:05:59.782763 4885 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="init" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.782848 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="init" Mar 08 21:05:59 crc kubenswrapper[4885]: E0308 21:05:59.782954 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" containerName="nova-scheduler-scheduler" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.783027 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" containerName="nova-scheduler-scheduler" Mar 08 21:05:59 crc kubenswrapper[4885]: E0308 21:05:59.783149 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="dnsmasq-dns" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.783222 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="dnsmasq-dns" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.783523 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" containerName="nova-scheduler-scheduler" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.783616 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="dnsmasq-dns" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.784766 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.791379 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.796629 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.800545 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8crmt"] Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.818416 4885 generic.go:334] "Generic (PLEG): container finished" podID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" containerID="df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2" exitCode=0 Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.818480 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e08bdfc6-0196-4b23-b6a5-b0c947f646e7","Type":"ContainerDied","Data":"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2"} Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.818515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e08bdfc6-0196-4b23-b6a5-b0c947f646e7","Type":"ContainerDied","Data":"5cf04ad54507a736672fee7faf37e9b754d2e6cb6f2076a1a5654158ee25a581"} Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.818533 4885 scope.go:117] "RemoveContainer" containerID="df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.818690 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.864830 4885 scope.go:117] "RemoveContainer" containerID="df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2" Mar 08 21:05:59 crc kubenswrapper[4885]: E0308 21:05:59.869103 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2\": container with ID starting with df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2 not found: ID does not exist" containerID="df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.869149 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2"} err="failed to get container status \"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2\": rpc error: code = NotFound desc = could not find container \"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2\": container with ID starting with df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2 not found: ID does not exist" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.929999 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.953961 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.970957 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66fz6\" (UniqueName: \"kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " 
pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.971038 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.971067 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.971100 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.993206 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:05:59.994715 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.005890 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.020813 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.072192 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66fz6\" (UniqueName: \"kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.072255 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.072290 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.072326 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 
21:06:00.076310 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.076872 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.082448 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.092765 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66fz6\" (UniqueName: \"kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.134488 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550066-6hpw5"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.136425 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.140235 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.141353 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.141430 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.163637 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550066-6hpw5"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.171159 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.173266 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.173447 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.173513 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.276004 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.276362 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.276538 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.276638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4whn\" (UniqueName: \"kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn\") pod \"auto-csr-approver-29550066-6hpw5\" (UID: \"ddbd1248-e534-4251-b5a6-0505b7710e6e\") " pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.281240 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.281264 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.303274 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.353418 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.377633 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4whn\" (UniqueName: \"kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn\") pod \"auto-csr-approver-29550066-6hpw5\" (UID: \"ddbd1248-e534-4251-b5a6-0505b7710e6e\") " pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.407577 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4whn\" (UniqueName: \"kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn\") pod \"auto-csr-approver-29550066-6hpw5\" (UID: \"ddbd1248-e534-4251-b5a6-0505b7710e6e\") " pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.461101 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.719729 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8crmt"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.735294 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550066-6hpw5"] Mar 08 21:06:00 crc kubenswrapper[4885]: W0308 21:06:00.735415 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3405968f_173e_4ab2_a8ac_699fdaaad4d3.slice/crio-8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132 WatchSource:0}: Error finding container 8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132: Status 404 returned error can't find the container with id 8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132 Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.820380 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.840834 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8crmt" event={"ID":"3405968f-173e-4ab2-a8ac-699fdaaad4d3","Type":"ContainerStarted","Data":"8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132"} Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.846185 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" event={"ID":"ddbd1248-e534-4251-b5a6-0505b7710e6e","Type":"ContainerStarted","Data":"f4a62a10f48b881adff6ba21994ad309b2fffdf31994bf17c62cbe92bb1d398d"} Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.400435 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" 
path="/var/lib/kubelet/pods/e08bdfc6-0196-4b23-b6a5-b0c947f646e7/volumes" Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.462692 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.462884 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.865585 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0878726-55a7-4b21-95ba-4dda1491dfdd","Type":"ContainerStarted","Data":"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14"} Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.865658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0878726-55a7-4b21-95ba-4dda1491dfdd","Type":"ContainerStarted","Data":"fc7da7a57e0febad3191a8b8237c25b407f5b585b3c92502602c4310b7e751ee"} Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.868141 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8crmt" event={"ID":"3405968f-173e-4ab2-a8ac-699fdaaad4d3","Type":"ContainerStarted","Data":"8930c6f62095bcf638d08901a77c2574bea75083141756a22093eb3aac06abfe"} Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.898498 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.898468433 podStartE2EDuration="2.898468433s" podCreationTimestamp="2026-03-08 21:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:01.888602499 +0000 UTC m=+5663.284656522" watchObservedRunningTime="2026-03-08 21:06:01.898468433 +0000 UTC m=+5663.294522456" Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.924335 4885 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-cell1-cell-mapping-8crmt" podStartSLOduration=2.924307781 podStartE2EDuration="2.924307781s" podCreationTimestamp="2026-03-08 21:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:01.908586212 +0000 UTC m=+5663.304640235" watchObservedRunningTime="2026-03-08 21:06:01.924307781 +0000 UTC m=+5663.320361834" Mar 08 21:06:02 crc kubenswrapper[4885]: I0308 21:06:02.880544 4885 generic.go:334] "Generic (PLEG): container finished" podID="ddbd1248-e534-4251-b5a6-0505b7710e6e" containerID="f4d740c9938b3b085cc1665a4c48f0e8e5909dace559f7eedf545ed929b6ffde" exitCode=0 Mar 08 21:06:02 crc kubenswrapper[4885]: I0308 21:06:02.880672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" event={"ID":"ddbd1248-e534-4251-b5a6-0505b7710e6e","Type":"ContainerDied","Data":"f4d740c9938b3b085cc1665a4c48f0e8e5909dace559f7eedf545ed929b6ffde"} Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.373008 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.475711 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4whn\" (UniqueName: \"kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn\") pod \"ddbd1248-e534-4251-b5a6-0505b7710e6e\" (UID: \"ddbd1248-e534-4251-b5a6-0505b7710e6e\") " Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.488847 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn" (OuterVolumeSpecName: "kube-api-access-d4whn") pod "ddbd1248-e534-4251-b5a6-0505b7710e6e" (UID: "ddbd1248-e534-4251-b5a6-0505b7710e6e"). 
InnerVolumeSpecName "kube-api-access-d4whn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.579166 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4whn\" (UniqueName: \"kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.912531 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" event={"ID":"ddbd1248-e534-4251-b5a6-0505b7710e6e","Type":"ContainerDied","Data":"f4a62a10f48b881adff6ba21994ad309b2fffdf31994bf17c62cbe92bb1d398d"} Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.912596 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4a62a10f48b881adff6ba21994ad309b2fffdf31994bf17c62cbe92bb1d398d" Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.912627 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.305650 4885 scope.go:117] "RemoveContainer" containerID="93dc9dbb2536460c751fb5259c99f80a1281e51794443a335faf96ba42cb4c59" Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.353576 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.501852 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550060-ztwmh"] Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.514980 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550060-ztwmh"] Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.927584 4885 generic.go:334] "Generic (PLEG): container finished" podID="3405968f-173e-4ab2-a8ac-699fdaaad4d3" containerID="8930c6f62095bcf638d08901a77c2574bea75083141756a22093eb3aac06abfe" exitCode=0 Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.927674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8crmt" event={"ID":"3405968f-173e-4ab2-a8ac-699fdaaad4d3","Type":"ContainerDied","Data":"8930c6f62095bcf638d08901a77c2574bea75083141756a22093eb3aac06abfe"} Mar 08 21:06:06 crc kubenswrapper[4885]: I0308 21:06:06.463135 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:06:06 crc kubenswrapper[4885]: I0308 21:06:06.463610 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:06:06 crc kubenswrapper[4885]: I0308 21:06:06.495963 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:06:06 crc kubenswrapper[4885]: I0308 21:06:06.496062 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.358507 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.368243 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.392778 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0100b61-a97f-40b6-b8fd-91499667f3d9" path="/var/lib/kubelet/pods/a0100b61-a97f-40b6-b8fd-91499667f3d9/volumes" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.450019 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data\") pod \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.450110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle\") pod \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.450227 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts\") pod \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.450416 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66fz6\" (UniqueName: \"kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6\") pod 
\"3405968f-173e-4ab2-a8ac-699fdaaad4d3\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.457146 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts" (OuterVolumeSpecName: "scripts") pod "3405968f-173e-4ab2-a8ac-699fdaaad4d3" (UID: "3405968f-173e-4ab2-a8ac-699fdaaad4d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.461042 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6" (OuterVolumeSpecName: "kube-api-access-66fz6") pod "3405968f-173e-4ab2-a8ac-699fdaaad4d3" (UID: "3405968f-173e-4ab2-a8ac-699fdaaad4d3"). InnerVolumeSpecName "kube-api-access-66fz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.478851 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3405968f-173e-4ab2-a8ac-699fdaaad4d3" (UID: "3405968f-173e-4ab2-a8ac-699fdaaad4d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.484044 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data" (OuterVolumeSpecName: "config-data") pod "3405968f-173e-4ab2-a8ac-699fdaaad4d3" (UID: "3405968f-173e-4ab2-a8ac-699fdaaad4d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.552340 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.552386 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66fz6\" (UniqueName: \"kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.552401 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.552412 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.628252 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.111:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.628377 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.110:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.628522 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.111:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.628272 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.110:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.950806 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf"} Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.953116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8crmt" event={"ID":"3405968f-173e-4ab2-a8ac-699fdaaad4d3","Type":"ContainerDied","Data":"8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132"} Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.953142 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.953246 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.197608 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.197856 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-log" containerID="cri-o://145acadb2aaec8dcd5165a895f8909b8c1b38463cef4ef90b51c7692b91fbfd5" gracePeriod=30 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.198352 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-api" containerID="cri-o://756c1aa575f487dbda9fdf021512017307ddd58458c8521dba14b321c91f8b89" gracePeriod=30 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.217832 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.218148 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d0878726-55a7-4b21-95ba-4dda1491dfdd" containerName="nova-scheduler-scheduler" containerID="cri-o://3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14" gracePeriod=30 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.231121 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.231511 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-log" containerID="cri-o://1c9a277208821dbe45c51d736ad5778842b9c47f876f42497004e47858db7d7a" gracePeriod=30 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.231597 4885 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-metadata" containerID="cri-o://cdeaf3122ca8dbc663268a6c75437ef3b2d630cf202295db223a82e3e219dba9" gracePeriod=30 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.972868 4885 generic.go:334] "Generic (PLEG): container finished" podID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerID="145acadb2aaec8dcd5165a895f8909b8c1b38463cef4ef90b51c7692b91fbfd5" exitCode=143 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.973149 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerDied","Data":"145acadb2aaec8dcd5165a895f8909b8c1b38463cef4ef90b51c7692b91fbfd5"} Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.980595 4885 generic.go:334] "Generic (PLEG): container finished" podID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerID="1c9a277208821dbe45c51d736ad5778842b9c47f876f42497004e47858db7d7a" exitCode=143 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.980670 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerDied","Data":"1c9a277208821dbe45c51d736ad5778842b9c47f876f42497004e47858db7d7a"} Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.549738 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.673938 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc\") pod \"d0878726-55a7-4b21-95ba-4dda1491dfdd\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.674583 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle\") pod \"d0878726-55a7-4b21-95ba-4dda1491dfdd\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.674687 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data\") pod \"d0878726-55a7-4b21-95ba-4dda1491dfdd\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.682477 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc" (OuterVolumeSpecName: "kube-api-access-6xxpc") pod "d0878726-55a7-4b21-95ba-4dda1491dfdd" (UID: "d0878726-55a7-4b21-95ba-4dda1491dfdd"). InnerVolumeSpecName "kube-api-access-6xxpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.710101 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0878726-55a7-4b21-95ba-4dda1491dfdd" (UID: "d0878726-55a7-4b21-95ba-4dda1491dfdd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.727458 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data" (OuterVolumeSpecName: "config-data") pod "d0878726-55a7-4b21-95ba-4dda1491dfdd" (UID: "d0878726-55a7-4b21-95ba-4dda1491dfdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.777556 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.777585 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.777595 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.034077 4885 generic.go:334] "Generic (PLEG): container finished" podID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerID="cdeaf3122ca8dbc663268a6c75437ef3b2d630cf202295db223a82e3e219dba9" exitCode=0 Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.034137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerDied","Data":"cdeaf3122ca8dbc663268a6c75437ef3b2d630cf202295db223a82e3e219dba9"} Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.037311 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerID="756c1aa575f487dbda9fdf021512017307ddd58458c8521dba14b321c91f8b89" exitCode=0 Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.037386 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerDied","Data":"756c1aa575f487dbda9fdf021512017307ddd58458c8521dba14b321c91f8b89"} Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.039347 4885 generic.go:334] "Generic (PLEG): container finished" podID="d0878726-55a7-4b21-95ba-4dda1491dfdd" containerID="3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14" exitCode=0 Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.039384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0878726-55a7-4b21-95ba-4dda1491dfdd","Type":"ContainerDied","Data":"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14"} Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.039452 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.039484 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0878726-55a7-4b21-95ba-4dda1491dfdd","Type":"ContainerDied","Data":"fc7da7a57e0febad3191a8b8237c25b407f5b585b3c92502602c4310b7e751ee"} Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.039516 4885 scope.go:117] "RemoveContainer" containerID="3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.109137 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.125118 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.127642 4885 scope.go:117] "RemoveContainer" containerID="3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.130992 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:13 crc kubenswrapper[4885]: E0308 21:06:13.131498 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3405968f-173e-4ab2-a8ac-699fdaaad4d3" containerName="nova-manage" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131517 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3405968f-173e-4ab2-a8ac-699fdaaad4d3" containerName="nova-manage" Mar 08 21:06:13 crc kubenswrapper[4885]: E0308 21:06:13.131546 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddbd1248-e534-4251-b5a6-0505b7710e6e" containerName="oc" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131555 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddbd1248-e534-4251-b5a6-0505b7710e6e" containerName="oc" Mar 08 21:06:13 crc kubenswrapper[4885]: E0308 21:06:13.131573 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0878726-55a7-4b21-95ba-4dda1491dfdd" containerName="nova-scheduler-scheduler" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131583 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0878726-55a7-4b21-95ba-4dda1491dfdd" containerName="nova-scheduler-scheduler" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131826 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0878726-55a7-4b21-95ba-4dda1491dfdd" containerName="nova-scheduler-scheduler" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131855 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3405968f-173e-4ab2-a8ac-699fdaaad4d3" containerName="nova-manage" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131873 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddbd1248-e534-4251-b5a6-0505b7710e6e" containerName="oc" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.132842 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: E0308 21:06:13.133224 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14\": container with ID starting with 3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14 not found: ID does not exist" containerID="3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.133282 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14"} err="failed to get container status \"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14\": rpc error: code = NotFound desc = could not find container \"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14\": container with ID starting with 3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14 not found: ID does not exist" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.135691 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.137106 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.245376 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.251650 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.290978 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.291023 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sf6d\" (UniqueName: \"kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.291246 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.383256 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0878726-55a7-4b21-95ba-4dda1491dfdd" path="/var/lib/kubelet/pods/d0878726-55a7-4b21-95ba-4dda1491dfdd/volumes" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392216 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data\") pod \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392323 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj25c\" (UniqueName: 
\"kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c\") pod \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392369 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs\") pod \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392432 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle\") pod \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392464 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle\") pod \"d8a3be88-3b2d-4cd0-8987-443d67351acb\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392535 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfkcz\" (UniqueName: \"kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz\") pod \"d8a3be88-3b2d-4cd0-8987-443d67351acb\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392577 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs\") pod \"d8a3be88-3b2d-4cd0-8987-443d67351acb\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392613 
4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data\") pod \"d8a3be88-3b2d-4cd0-8987-443d67351acb\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392874 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392983 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.393007 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sf6d\" (UniqueName: \"kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.394365 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs" (OuterVolumeSpecName: "logs") pod "d8a3be88-3b2d-4cd0-8987-443d67351acb" (UID: "d8a3be88-3b2d-4cd0-8987-443d67351acb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.394611 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs" (OuterVolumeSpecName: "logs") pod "8d41f915-4ddc-4f84-b402-67e3ff310d0e" (UID: "8d41f915-4ddc-4f84-b402-67e3ff310d0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.400424 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c" (OuterVolumeSpecName: "kube-api-access-sj25c") pod "8d41f915-4ddc-4f84-b402-67e3ff310d0e" (UID: "8d41f915-4ddc-4f84-b402-67e3ff310d0e"). InnerVolumeSpecName "kube-api-access-sj25c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.400874 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz" (OuterVolumeSpecName: "kube-api-access-cfkcz") pod "d8a3be88-3b2d-4cd0-8987-443d67351acb" (UID: "d8a3be88-3b2d-4cd0-8987-443d67351acb"). InnerVolumeSpecName "kube-api-access-cfkcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.403188 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.413456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.417403 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sf6d\" (UniqueName: \"kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.427857 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data" (OuterVolumeSpecName: "config-data") pod "8d41f915-4ddc-4f84-b402-67e3ff310d0e" (UID: "8d41f915-4ddc-4f84-b402-67e3ff310d0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.428197 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d41f915-4ddc-4f84-b402-67e3ff310d0e" (UID: "8d41f915-4ddc-4f84-b402-67e3ff310d0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.429482 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data" (OuterVolumeSpecName: "config-data") pod "d8a3be88-3b2d-4cd0-8987-443d67351acb" (UID: "d8a3be88-3b2d-4cd0-8987-443d67351acb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.440147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8a3be88-3b2d-4cd0-8987-443d67351acb" (UID: "d8a3be88-3b2d-4cd0-8987-443d67351acb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494356 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfkcz\" (UniqueName: \"kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494390 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494403 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494417 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc 
kubenswrapper[4885]: I0308 21:06:13.494429 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj25c\" (UniqueName: \"kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494441 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494453 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494464 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.539392 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.056140 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.056211 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerDied","Data":"20924542c83be387f0ae030282c946d65a2f7a96ddb827f2a18a8e4ab1e8eddb"} Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.056773 4885 scope.go:117] "RemoveContainer" containerID="cdeaf3122ca8dbc663268a6c75437ef3b2d630cf202295db223a82e3e219dba9" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.059275 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerDied","Data":"8c02c0dd36a35b639dcbf4fdeb957cafcaf6477b4aefd990c59836fff31381b3"} Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.059396 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: W0308 21:06:14.077375 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a29a091_3ebc_4dbb_b876_19892bedba02.slice/crio-3c576770d28108955fb30624b271f23df81e0516b1d08085cc88a640253a2d1e WatchSource:0}: Error finding container 3c576770d28108955fb30624b271f23df81e0516b1d08085cc88a640253a2d1e: Status 404 returned error can't find the container with id 3c576770d28108955fb30624b271f23df81e0516b1d08085cc88a640253a2d1e Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.077456 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.113417 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.122054 4885 scope.go:117] "RemoveContainer" 
containerID="1c9a277208821dbe45c51d736ad5778842b9c47f876f42497004e47858db7d7a" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.134832 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.152801 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.169725 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.185048 4885 scope.go:117] "RemoveContainer" containerID="756c1aa575f487dbda9fdf021512017307ddd58458c8521dba14b321c91f8b89" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.190734 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: E0308 21:06:14.191225 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-api" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191240 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-api" Mar 08 21:06:14 crc kubenswrapper[4885]: E0308 21:06:14.191267 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-metadata" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191276 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-metadata" Mar 08 21:06:14 crc kubenswrapper[4885]: E0308 21:06:14.191290 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-log" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191299 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-log" Mar 08 21:06:14 crc kubenswrapper[4885]: E0308 21:06:14.191318 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-log" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191327 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-log" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191533 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-metadata" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191551 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-log" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191566 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-api" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191585 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-log" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.192643 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.195347 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.202966 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.227992 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.229868 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.231520 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.237990 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.266058 4885 scope.go:117] "RemoveContainer" containerID="145acadb2aaec8dcd5165a895f8909b8c1b38463cef4ef90b51c7692b91fbfd5" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.321102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74c4d\" (UniqueName: \"kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.321154 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 
21:06:14.321176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.321446 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422630 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422683 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422756 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpccf\" (UniqueName: \"kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422782 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422818 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74c4d\" (UniqueName: \"kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422897 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.423570 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc 
kubenswrapper[4885]: I0308 21:06:14.426958 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.429810 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.447465 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74c4d\" (UniqueName: \"kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.524662 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpccf\" (UniqueName: \"kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.524894 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.525029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.525197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.525586 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.528570 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.529164 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.547166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpccf\" (UniqueName: \"kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.567103 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.635459 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.992064 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.038898 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.073148 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a29a091-3ebc-4dbb-b876-19892bedba02","Type":"ContainerStarted","Data":"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd"} Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.073202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a29a091-3ebc-4dbb-b876-19892bedba02","Type":"ContainerStarted","Data":"3c576770d28108955fb30624b271f23df81e0516b1d08085cc88a640253a2d1e"} Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.075321 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerStarted","Data":"ed98b0990237ad316a273718be6c6f8f3198828e148541a9840c7e6321b7e7da"} Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.076252 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerStarted","Data":"77ebc86a187e7f428857986b384dc697b96b7685acfe2d360f9674aa240afe23"} Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.096368 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.096351423 podStartE2EDuration="2.096351423s" podCreationTimestamp="2026-03-08 
21:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:15.08952535 +0000 UTC m=+5676.485579383" watchObservedRunningTime="2026-03-08 21:06:15.096351423 +0000 UTC m=+5676.492405446" Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.381631 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" path="/var/lib/kubelet/pods/8d41f915-4ddc-4f84-b402-67e3ff310d0e/volumes" Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.383170 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" path="/var/lib/kubelet/pods/d8a3be88-3b2d-4cd0-8987-443d67351acb/volumes" Mar 08 21:06:16 crc kubenswrapper[4885]: I0308 21:06:16.091082 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerStarted","Data":"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49"} Mar 08 21:06:16 crc kubenswrapper[4885]: I0308 21:06:16.091151 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerStarted","Data":"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b"} Mar 08 21:06:16 crc kubenswrapper[4885]: I0308 21:06:16.095801 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerStarted","Data":"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895"} Mar 08 21:06:16 crc kubenswrapper[4885]: I0308 21:06:16.095866 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerStarted","Data":"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324"} Mar 08 
21:06:16 crc kubenswrapper[4885]: I0308 21:06:16.118475 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.118460096 podStartE2EDuration="2.118460096s" podCreationTimestamp="2026-03-08 21:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:16.113257777 +0000 UTC m=+5677.509311800" watchObservedRunningTime="2026-03-08 21:06:16.118460096 +0000 UTC m=+5677.514514109" Mar 08 21:06:16 crc kubenswrapper[4885]: I0308 21:06:16.162585 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.162569582 podStartE2EDuration="2.162569582s" podCreationTimestamp="2026-03-08 21:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:16.157199429 +0000 UTC m=+5677.553253462" watchObservedRunningTime="2026-03-08 21:06:16.162569582 +0000 UTC m=+5677.558623605" Mar 08 21:06:18 crc kubenswrapper[4885]: I0308 21:06:18.540246 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 21:06:19 crc kubenswrapper[4885]: I0308 21:06:19.569024 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:06:19 crc kubenswrapper[4885]: I0308 21:06:19.569075 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.607872 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.611290 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.655147 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.685147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.685421 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.685710 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxp2w\" (UniqueName: \"kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.787195 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxp2w\" (UniqueName: \"kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.787300 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.787351 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.787900 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.787972 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.827796 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxp2w\" (UniqueName: \"kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.969723 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:22 crc kubenswrapper[4885]: I0308 21:06:22.471905 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:23 crc kubenswrapper[4885]: I0308 21:06:23.182544 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerID="4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277" exitCode=0 Mar 08 21:06:23 crc kubenswrapper[4885]: I0308 21:06:23.182706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerDied","Data":"4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277"} Mar 08 21:06:23 crc kubenswrapper[4885]: I0308 21:06:23.183877 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerStarted","Data":"80e3a927b22816a9112bee41b9a53b04401331287a3f011a1dc86a4440d4689e"} Mar 08 21:06:23 crc kubenswrapper[4885]: I0308 21:06:23.540640 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 21:06:23 crc kubenswrapper[4885]: I0308 21:06:23.572196 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 21:06:24 crc kubenswrapper[4885]: I0308 21:06:24.196362 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerStarted","Data":"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11"} Mar 08 21:06:24 crc kubenswrapper[4885]: I0308 21:06:24.261679 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 21:06:24 crc 
kubenswrapper[4885]: I0308 21:06:24.569197 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:06:24 crc kubenswrapper[4885]: I0308 21:06:24.569267 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:06:24 crc kubenswrapper[4885]: I0308 21:06:24.636194 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:06:24 crc kubenswrapper[4885]: I0308 21:06:24.636265 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:06:25 crc kubenswrapper[4885]: I0308 21:06:25.206965 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerID="43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11" exitCode=0 Mar 08 21:06:25 crc kubenswrapper[4885]: I0308 21:06:25.208420 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerDied","Data":"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11"} Mar 08 21:06:25 crc kubenswrapper[4885]: I0308 21:06:25.651522 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.116:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:25 crc kubenswrapper[4885]: I0308 21:06:25.651579 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.116:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:25 crc 
kubenswrapper[4885]: I0308 21:06:25.733233 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.117:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:25 crc kubenswrapper[4885]: I0308 21:06:25.733649 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.117:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:26 crc kubenswrapper[4885]: I0308 21:06:26.218507 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerStarted","Data":"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13"} Mar 08 21:06:26 crc kubenswrapper[4885]: I0308 21:06:26.249241 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5zrkw" podStartSLOduration=2.831514046 podStartE2EDuration="5.249222935s" podCreationTimestamp="2026-03-08 21:06:21 +0000 UTC" firstStartedPulling="2026-03-08 21:06:23.185078435 +0000 UTC m=+5684.581132458" lastFinishedPulling="2026-03-08 21:06:25.602787294 +0000 UTC m=+5686.998841347" observedRunningTime="2026-03-08 21:06:26.238776957 +0000 UTC m=+5687.634830980" watchObservedRunningTime="2026-03-08 21:06:26.249222935 +0000 UTC m=+5687.645276958" Mar 08 21:06:31 crc kubenswrapper[4885]: I0308 21:06:31.970797 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:31 crc kubenswrapper[4885]: I0308 21:06:31.971412 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:33 crc kubenswrapper[4885]: I0308 21:06:33.041702 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5zrkw" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="registry-server" probeResult="failure" output=< Mar 08 21:06:33 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 21:06:33 crc kubenswrapper[4885]: > Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.570912 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.575975 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.577335 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.641103 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.641906 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.642078 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.644658 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.306720 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.309647 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 
21:06:35.311054 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.581340 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.582974 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.597309 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.754482 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.754535 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.754638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdc7k\" (UniqueName: \"kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.754788 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.754844 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.856780 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.856841 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.856868 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdc7k\" (UniqueName: \"kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.856907 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.856943 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.858184 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.858422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.858634 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.858635 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.883972 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdc7k\" (UniqueName: \"kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.920465 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:36 crc kubenswrapper[4885]: I0308 21:06:36.484029 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:06:37 crc kubenswrapper[4885]: I0308 21:06:37.336461 4885 generic.go:334] "Generic (PLEG): container finished" podID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerID="44db3e80d53ecaa3d76c25ae2231f68ed9a8e2480156df67bfa8787c436f51c1" exitCode=0 Mar 08 21:06:37 crc kubenswrapper[4885]: I0308 21:06:37.336576 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" event={"ID":"2ed9193c-0c46-47cb-af24-c8415837c19b","Type":"ContainerDied","Data":"44db3e80d53ecaa3d76c25ae2231f68ed9a8e2480156df67bfa8787c436f51c1"} Mar 08 21:06:37 crc kubenswrapper[4885]: I0308 21:06:37.337007 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" event={"ID":"2ed9193c-0c46-47cb-af24-c8415837c19b","Type":"ContainerStarted","Data":"13a5f96703742b15ab41ce5ca4bff51ff0ff5f629fdccca2879c9831c1547b90"} Mar 08 21:06:38 crc kubenswrapper[4885]: I0308 21:06:38.348356 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" 
event={"ID":"2ed9193c-0c46-47cb-af24-c8415837c19b","Type":"ContainerStarted","Data":"750d78661c34c956acafb525ed97e2c26316d92d4d7f7130df1013f1f9bc8ad8"} Mar 08 21:06:38 crc kubenswrapper[4885]: I0308 21:06:38.348804 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:38 crc kubenswrapper[4885]: I0308 21:06:38.373610 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" podStartSLOduration=3.373588332 podStartE2EDuration="3.373588332s" podCreationTimestamp="2026-03-08 21:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:38.367711015 +0000 UTC m=+5699.763765038" watchObservedRunningTime="2026-03-08 21:06:38.373588332 +0000 UTC m=+5699.769642355" Mar 08 21:06:42 crc kubenswrapper[4885]: I0308 21:06:42.041620 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:42 crc kubenswrapper[4885]: I0308 21:06:42.111441 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:42 crc kubenswrapper[4885]: I0308 21:06:42.281333 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:43 crc kubenswrapper[4885]: I0308 21:06:43.396689 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5zrkw" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="registry-server" containerID="cri-o://32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13" gracePeriod=2 Mar 08 21:06:43 crc kubenswrapper[4885]: I0308 21:06:43.985187 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.148616 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content\") pod \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.148742 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxp2w\" (UniqueName: \"kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w\") pod \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.148885 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities\") pod \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.149700 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities" (OuterVolumeSpecName: "utilities") pod "2a0b0e8c-3002-4dcd-9172-998602ca9be9" (UID: "2a0b0e8c-3002-4dcd-9172-998602ca9be9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.164273 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w" (OuterVolumeSpecName: "kube-api-access-bxp2w") pod "2a0b0e8c-3002-4dcd-9172-998602ca9be9" (UID: "2a0b0e8c-3002-4dcd-9172-998602ca9be9"). InnerVolumeSpecName "kube-api-access-bxp2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.250709 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxp2w\" (UniqueName: \"kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.250748 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.292327 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a0b0e8c-3002-4dcd-9172-998602ca9be9" (UID: "2a0b0e8c-3002-4dcd-9172-998602ca9be9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.352109 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.410558 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerID="32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13" exitCode=0 Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.410602 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerDied","Data":"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13"} Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.410639 4885 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerDied","Data":"80e3a927b22816a9112bee41b9a53b04401331287a3f011a1dc86a4440d4689e"} Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.410654 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.410664 4885 scope.go:117] "RemoveContainer" containerID="32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.438262 4885 scope.go:117] "RemoveContainer" containerID="43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.452359 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.459778 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.478133 4885 scope.go:117] "RemoveContainer" containerID="4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.536877 4885 scope.go:117] "RemoveContainer" containerID="32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13" Mar 08 21:06:44 crc kubenswrapper[4885]: E0308 21:06:44.537269 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13\": container with ID starting with 32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13 not found: ID does not exist" containerID="32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.537410 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13"} err="failed to get container status \"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13\": rpc error: code = NotFound desc = could not find container \"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13\": container with ID starting with 32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13 not found: ID does not exist" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.537494 4885 scope.go:117] "RemoveContainer" containerID="43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11" Mar 08 21:06:44 crc kubenswrapper[4885]: E0308 21:06:44.537882 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11\": container with ID starting with 43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11 not found: ID does not exist" containerID="43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.537984 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11"} err="failed to get container status \"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11\": rpc error: code = NotFound desc = could not find container \"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11\": container with ID starting with 43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11 not found: ID does not exist" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.538056 4885 scope.go:117] "RemoveContainer" containerID="4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277" Mar 08 21:06:44 crc kubenswrapper[4885]: E0308 
21:06:44.538548 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277\": container with ID starting with 4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277 not found: ID does not exist" containerID="4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.538674 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277"} err="failed to get container status \"4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277\": rpc error: code = NotFound desc = could not find container \"4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277\": container with ID starting with 4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277 not found: ID does not exist" Mar 08 21:06:45 crc kubenswrapper[4885]: I0308 21:06:45.379054 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" path="/var/lib/kubelet/pods/2a0b0e8c-3002-4dcd-9172-998602ca9be9/volumes" Mar 08 21:06:45 crc kubenswrapper[4885]: I0308 21:06:45.922219 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.026301 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.026603 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="dnsmasq-dns" containerID="cri-o://53391697a2f4f90c5ca24bdf716499111736339ebf0bad4e24ae2562e126d17f" gracePeriod=10 Mar 08 21:06:46 crc 
kubenswrapper[4885]: I0308 21:06:46.435944 4885 generic.go:334] "Generic (PLEG): container finished" podID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerID="53391697a2f4f90c5ca24bdf716499111736339ebf0bad4e24ae2562e126d17f" exitCode=0 Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.436145 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" event={"ID":"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b","Type":"ContainerDied","Data":"53391697a2f4f90c5ca24bdf716499111736339ebf0bad4e24ae2562e126d17f"} Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.517258 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.589133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb\") pod \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.589183 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb\") pod \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.589234 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc\") pod \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.589293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grkm9\" (UniqueName: 
\"kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9\") pod \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.589421 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config\") pod \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.600629 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9" (OuterVolumeSpecName: "kube-api-access-grkm9") pod "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" (UID: "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b"). InnerVolumeSpecName "kube-api-access-grkm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.635034 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" (UID: "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.650891 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" (UID: "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.663687 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config" (OuterVolumeSpecName: "config") pod "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" (UID: "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.687667 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" (UID: "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.691356 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.691386 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.691397 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.691409 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:46 crc kubenswrapper[4885]: 
I0308 21:06:46.691420 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grkm9\" (UniqueName: \"kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.447114 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" event={"ID":"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b","Type":"ContainerDied","Data":"4c375bccd82478c29984cb004edc82a922cd3f66f47be6d3e4038a2ff4cf6623"} Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.447175 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.447178 4885 scope.go:117] "RemoveContainer" containerID="53391697a2f4f90c5ca24bdf716499111736339ebf0bad4e24ae2562e126d17f" Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.473626 4885 scope.go:117] "RemoveContainer" containerID="364b7c9836de2ac4dcd0074d16339cb1e1fe0eee56d6ea6aba2ce5bd28ef8b4b" Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.477121 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.498044 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.350699 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-cclgv"] Mar 08 21:06:48 crc kubenswrapper[4885]: E0308 21:06:48.351797 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="extract-utilities" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.351823 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="extract-utilities" Mar 08 
21:06:48 crc kubenswrapper[4885]: E0308 21:06:48.351839 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="extract-content" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.351848 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="extract-content" Mar 08 21:06:48 crc kubenswrapper[4885]: E0308 21:06:48.351860 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="registry-server" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.351868 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="registry-server" Mar 08 21:06:48 crc kubenswrapper[4885]: E0308 21:06:48.351892 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="dnsmasq-dns" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.351899 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="dnsmasq-dns" Mar 08 21:06:48 crc kubenswrapper[4885]: E0308 21:06:48.352409 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="init" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.352423 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="init" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.352629 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="registry-server" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.352652 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="dnsmasq-dns" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 
21:06:48.353552 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.362931 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cclgv"] Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.425392 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.425494 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtwh8\" (UniqueName: \"kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.480015 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2e95-account-create-update-thqbx"] Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.481115 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.495038 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.499126 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2e95-account-create-update-thqbx"] Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.529558 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.529607 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtwh8\" (UniqueName: \"kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.531262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.580693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtwh8\" (UniqueName: \"kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 
21:06:48.634698 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz558\" (UniqueName: \"kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558\") pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.635010 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts\") pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.678966 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.736997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz558\" (UniqueName: \"kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558\") pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.737071 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts\") pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.737701 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts\") pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.754330 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz558\" (UniqueName: \"kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558\") pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.800394 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:49 crc kubenswrapper[4885]: W0308 21:06:49.113722 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96e29afc_72d1_4b29_9528_1ed61feed290.slice/crio-0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa WatchSource:0}: Error finding container 0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa: Status 404 returned error can't find the container with id 0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.119380 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cclgv"] Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.253810 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2e95-account-create-update-thqbx"] Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.382689 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" 
path="/var/lib/kubelet/pods/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b/volumes" Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.468439 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2e95-account-create-update-thqbx" event={"ID":"3159e4ac-64da-47ba-9c70-b23214e8b8ad","Type":"ContainerStarted","Data":"d5baab592079e81c7a0f9f2d2a048f773b79e232fa80526c94c402b1d3d147c9"} Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.468486 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2e95-account-create-update-thqbx" event={"ID":"3159e4ac-64da-47ba-9c70-b23214e8b8ad","Type":"ContainerStarted","Data":"2592b482160778d89ae89f082b4c5d807f6749aa7cd668c503d444d47635c41e"} Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.470769 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cclgv" event={"ID":"96e29afc-72d1-4b29-9528-1ed61feed290","Type":"ContainerStarted","Data":"488ce5cd91e5332723ee20f8e8bbaf7d336b87f5ba2cbf84286a0e234f08758e"} Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.470792 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cclgv" event={"ID":"96e29afc-72d1-4b29-9528-1ed61feed290","Type":"ContainerStarted","Data":"0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa"} Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.486948 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2e95-account-create-update-thqbx" podStartSLOduration=1.48692895 podStartE2EDuration="1.48692895s" podCreationTimestamp="2026-03-08 21:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:49.486314444 +0000 UTC m=+5710.882368467" watchObservedRunningTime="2026-03-08 21:06:49.48692895 +0000 UTC m=+5710.882982973" Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.508498 4885 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-cclgv" podStartSLOduration=1.508480355 podStartE2EDuration="1.508480355s" podCreationTimestamp="2026-03-08 21:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:49.505351652 +0000 UTC m=+5710.901405675" watchObservedRunningTime="2026-03-08 21:06:49.508480355 +0000 UTC m=+5710.904534378" Mar 08 21:06:50 crc kubenswrapper[4885]: I0308 21:06:50.483599 4885 generic.go:334] "Generic (PLEG): container finished" podID="96e29afc-72d1-4b29-9528-1ed61feed290" containerID="488ce5cd91e5332723ee20f8e8bbaf7d336b87f5ba2cbf84286a0e234f08758e" exitCode=0 Mar 08 21:06:50 crc kubenswrapper[4885]: I0308 21:06:50.483658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cclgv" event={"ID":"96e29afc-72d1-4b29-9528-1ed61feed290","Type":"ContainerDied","Data":"488ce5cd91e5332723ee20f8e8bbaf7d336b87f5ba2cbf84286a0e234f08758e"} Mar 08 21:06:50 crc kubenswrapper[4885]: I0308 21:06:50.488299 4885 generic.go:334] "Generic (PLEG): container finished" podID="3159e4ac-64da-47ba-9c70-b23214e8b8ad" containerID="d5baab592079e81c7a0f9f2d2a048f773b79e232fa80526c94c402b1d3d147c9" exitCode=0 Mar 08 21:06:50 crc kubenswrapper[4885]: I0308 21:06:50.488380 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2e95-account-create-update-thqbx" event={"ID":"3159e4ac-64da-47ba-9c70-b23214e8b8ad","Type":"ContainerDied","Data":"d5baab592079e81c7a0f9f2d2a048f773b79e232fa80526c94c402b1d3d147c9"} Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.069334 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.075126 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.212077 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtwh8\" (UniqueName: \"kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8\") pod \"96e29afc-72d1-4b29-9528-1ed61feed290\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.212155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts\") pod \"96e29afc-72d1-4b29-9528-1ed61feed290\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.212405 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz558\" (UniqueName: \"kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558\") pod \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.212441 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts\") pod \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.213030 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "96e29afc-72d1-4b29-9528-1ed61feed290" (UID: "96e29afc-72d1-4b29-9528-1ed61feed290"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.213329 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3159e4ac-64da-47ba-9c70-b23214e8b8ad" (UID: "3159e4ac-64da-47ba-9c70-b23214e8b8ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.213787 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.213824 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.219039 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558" (OuterVolumeSpecName: "kube-api-access-vz558") pod "3159e4ac-64da-47ba-9c70-b23214e8b8ad" (UID: "3159e4ac-64da-47ba-9c70-b23214e8b8ad"). InnerVolumeSpecName "kube-api-access-vz558". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.224937 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8" (OuterVolumeSpecName: "kube-api-access-mtwh8") pod "96e29afc-72d1-4b29-9528-1ed61feed290" (UID: "96e29afc-72d1-4b29-9528-1ed61feed290"). InnerVolumeSpecName "kube-api-access-mtwh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.316301 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz558\" (UniqueName: \"kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.316353 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtwh8\" (UniqueName: \"kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.514684 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.514681 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2e95-account-create-update-thqbx" event={"ID":"3159e4ac-64da-47ba-9c70-b23214e8b8ad","Type":"ContainerDied","Data":"2592b482160778d89ae89f082b4c5d807f6749aa7cd668c503d444d47635c41e"} Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.514835 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2592b482160778d89ae89f082b4c5d807f6749aa7cd668c503d444d47635c41e" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.517028 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cclgv" event={"ID":"96e29afc-72d1-4b29-9528-1ed61feed290","Type":"ContainerDied","Data":"0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa"} Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.517080 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.517099 4885 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.806846 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-4k8w9"] Mar 08 21:06:53 crc kubenswrapper[4885]: E0308 21:06:53.807642 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e29afc-72d1-4b29-9528-1ed61feed290" containerName="mariadb-database-create" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.807658 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e29afc-72d1-4b29-9528-1ed61feed290" containerName="mariadb-database-create" Mar 08 21:06:53 crc kubenswrapper[4885]: E0308 21:06:53.807688 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3159e4ac-64da-47ba-9c70-b23214e8b8ad" containerName="mariadb-account-create-update" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.807699 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3159e4ac-64da-47ba-9c70-b23214e8b8ad" containerName="mariadb-account-create-update" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.807962 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e29afc-72d1-4b29-9528-1ed61feed290" containerName="mariadb-database-create" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.807987 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3159e4ac-64da-47ba-9c70-b23214e8b8ad" containerName="mariadb-account-create-update" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.808738 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.811342 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.811991 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rczmx" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.812383 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.817479 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4k8w9"] Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.944772 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.944846 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.944887 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5lxr\" (UniqueName: \"kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.944912 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.945102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.945170 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.047024 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.047085 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5lxr\" (UniqueName: \"kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.047136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.048294 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.048745 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.049280 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.049375 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.051654 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts\") pod \"cinder-db-sync-4k8w9\" (UID: 
\"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.052190 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.053020 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.053591 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.067483 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5lxr\" (UniqueName: \"kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.145700 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.635325 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4k8w9"] Mar 08 21:06:55 crc kubenswrapper[4885]: I0308 21:06:55.553819 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4k8w9" event={"ID":"6adbeb38-5e1d-43e0-a516-2cc65ad853aa","Type":"ContainerStarted","Data":"d64f534661786890c391e49d5e099a75e38f19a2ce6774e8bd719218475416e7"} Mar 08 21:06:55 crc kubenswrapper[4885]: I0308 21:06:55.554345 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4k8w9" event={"ID":"6adbeb38-5e1d-43e0-a516-2cc65ad853aa","Type":"ContainerStarted","Data":"d5c28a2881001ea956077fcf8b61adfbec3be96d16ab2bb26efabc1bd23cb7f3"} Mar 08 21:06:55 crc kubenswrapper[4885]: I0308 21:06:55.591898 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-4k8w9" podStartSLOduration=2.591871349 podStartE2EDuration="2.591871349s" podCreationTimestamp="2026-03-08 21:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:55.575320477 +0000 UTC m=+5716.971374550" watchObservedRunningTime="2026-03-08 21:06:55.591871349 +0000 UTC m=+5716.987925412" Mar 08 21:06:58 crc kubenswrapper[4885]: I0308 21:06:58.586079 4885 generic.go:334] "Generic (PLEG): container finished" podID="6adbeb38-5e1d-43e0-a516-2cc65ad853aa" containerID="d64f534661786890c391e49d5e099a75e38f19a2ce6774e8bd719218475416e7" exitCode=0 Mar 08 21:06:58 crc kubenswrapper[4885]: I0308 21:06:58.586204 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4k8w9" event={"ID":"6adbeb38-5e1d-43e0-a516-2cc65ad853aa","Type":"ContainerDied","Data":"d64f534661786890c391e49d5e099a75e38f19a2ce6774e8bd719218475416e7"} Mar 08 21:07:00 crc kubenswrapper[4885]: 
I0308 21:07:00.002297 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081381 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5lxr\" (UniqueName: \"kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081426 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081562 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081596 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081617 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081695 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.082071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.082576 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.086646 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts" (OuterVolumeSpecName: "scripts") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.087535 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr" (OuterVolumeSpecName: "kube-api-access-b5lxr") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "kube-api-access-b5lxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.087831 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.108693 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.125719 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data" (OuterVolumeSpecName: "config-data") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.184897 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.184952 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5lxr\" (UniqueName: \"kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.184966 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.184975 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.184983 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.626371 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4k8w9" event={"ID":"6adbeb38-5e1d-43e0-a516-2cc65ad853aa","Type":"ContainerDied","Data":"d5c28a2881001ea956077fcf8b61adfbec3be96d16ab2bb26efabc1bd23cb7f3"} Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.626437 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5c28a2881001ea956077fcf8b61adfbec3be96d16ab2bb26efabc1bd23cb7f3" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.626549 4885 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.978869 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:07:00 crc kubenswrapper[4885]: E0308 21:07:00.979555 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6adbeb38-5e1d-43e0-a516-2cc65ad853aa" containerName="cinder-db-sync" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.988912 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6adbeb38-5e1d-43e0-a516-2cc65ad853aa" containerName="cinder-db-sync" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.989295 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6adbeb38-5e1d-43e0-a516-2cc65ad853aa" containerName="cinder-db-sync" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.990283 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.994226 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.131467 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.131768 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " 
pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.131799 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.131900 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.131965 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvqpp\" (UniqueName: \"kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.132300 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.133610 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.135454 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.135628 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.143033 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.147663 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rczmx" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.150931 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.233934 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.233998 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rftcn\" (UniqueName: \"kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234027 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 
08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234051 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234077 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234104 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234119 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234236 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234271 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234637 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvqpp\" (UniqueName: \"kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.235215 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234898 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.235150 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.235019 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.235362 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.235611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.253666 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvqpp\" (UniqueName: \"kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.309962 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.337398 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rftcn\" (UniqueName: \"kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.337769 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.337797 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.338466 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.338658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.338705 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.338769 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.338866 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.339682 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.342217 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.342776 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.343212 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.343307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.360379 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rftcn\" (UniqueName: \"kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.451764 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: W0308 21:07:01.867984 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod938eebde_2664_4ae3_8289_e378affb1274.slice/crio-52e08bc4fa9512096f0690af07df832fb34183cd6a6514d48ec0a3244015bcc9 WatchSource:0}: Error finding container 52e08bc4fa9512096f0690af07df832fb34183cd6a6514d48ec0a3244015bcc9: Status 404 returned error can't find the container with id 52e08bc4fa9512096f0690af07df832fb34183cd6a6514d48ec0a3244015bcc9 Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.869868 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.982218 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:02 crc kubenswrapper[4885]: I0308 21:07:02.652275 4885 generic.go:334] "Generic (PLEG): container finished" podID="938eebde-2664-4ae3-8289-e378affb1274" containerID="12fb871f5a239d7fdcc6ca3f845e422dfc2911258d85c6c9852f5cbe4d01cbdc" exitCode=0 Mar 08 21:07:02 crc kubenswrapper[4885]: I0308 21:07:02.652611 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" event={"ID":"938eebde-2664-4ae3-8289-e378affb1274","Type":"ContainerDied","Data":"12fb871f5a239d7fdcc6ca3f845e422dfc2911258d85c6c9852f5cbe4d01cbdc"} Mar 08 21:07:02 crc kubenswrapper[4885]: I0308 21:07:02.652658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" event={"ID":"938eebde-2664-4ae3-8289-e378affb1274","Type":"ContainerStarted","Data":"52e08bc4fa9512096f0690af07df832fb34183cd6a6514d48ec0a3244015bcc9"} Mar 08 21:07:02 crc kubenswrapper[4885]: I0308 21:07:02.655336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerStarted","Data":"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c"} Mar 08 21:07:02 crc kubenswrapper[4885]: I0308 21:07:02.655365 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerStarted","Data":"73a221bfa90434743d951101b28f8fc6f753d32cb93f954ef81be7dadb45dff1"} Mar 08 21:07:03 crc kubenswrapper[4885]: I0308 21:07:03.664414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" event={"ID":"938eebde-2664-4ae3-8289-e378affb1274","Type":"ContainerStarted","Data":"196df1785b0b84f41c7ec30c506906a12662281699b2188f8f32887c1ea78555"} Mar 08 21:07:03 crc kubenswrapper[4885]: I0308 21:07:03.664800 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:03 crc kubenswrapper[4885]: I0308 21:07:03.667004 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerStarted","Data":"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83"} Mar 08 21:07:03 crc kubenswrapper[4885]: I0308 21:07:03.667295 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 08 21:07:03 crc kubenswrapper[4885]: I0308 21:07:03.694845 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" podStartSLOduration=3.69482053 podStartE2EDuration="3.69482053s" podCreationTimestamp="2026-03-08 21:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:03.682939804 +0000 UTC m=+5725.078993827" watchObservedRunningTime="2026-03-08 21:07:03.69482053 +0000 UTC m=+5725.090874563" Mar 08 21:07:03 crc 
kubenswrapper[4885]: I0308 21:07:03.716198 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.716180511 podStartE2EDuration="2.716180511s" podCreationTimestamp="2026-03-08 21:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:03.70679983 +0000 UTC m=+5725.102853843" watchObservedRunningTime="2026-03-08 21:07:03.716180511 +0000 UTC m=+5725.112234534" Mar 08 21:07:05 crc kubenswrapper[4885]: I0308 21:07:05.528583 4885 scope.go:117] "RemoveContainer" containerID="c0d8f0f8a4a0c8ee7bf5279891872ae208bc8c52a779f98dff22752b9bff60d5" Mar 08 21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.312150 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.407498 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.407892 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="dnsmasq-dns" containerID="cri-o://750d78661c34c956acafb525ed97e2c26316d92d4d7f7130df1013f1f9bc8ad8" gracePeriod=10 Mar 08 21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.782225 4885 generic.go:334] "Generic (PLEG): container finished" podID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerID="750d78661c34c956acafb525ed97e2c26316d92d4d7f7130df1013f1f9bc8ad8" exitCode=0 Mar 08 21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.782471 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" event={"ID":"2ed9193c-0c46-47cb-af24-c8415837c19b","Type":"ContainerDied","Data":"750d78661c34c956acafb525ed97e2c26316d92d4d7f7130df1013f1f9bc8ad8"} Mar 08 
21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.924810 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.119894 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc\") pod \"2ed9193c-0c46-47cb-af24-c8415837c19b\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.119985 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config\") pod \"2ed9193c-0c46-47cb-af24-c8415837c19b\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.120063 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdc7k\" (UniqueName: \"kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k\") pod \"2ed9193c-0c46-47cb-af24-c8415837c19b\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.120120 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb\") pod \"2ed9193c-0c46-47cb-af24-c8415837c19b\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.120144 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb\") pod \"2ed9193c-0c46-47cb-af24-c8415837c19b\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.127898 
4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k" (OuterVolumeSpecName: "kube-api-access-pdc7k") pod "2ed9193c-0c46-47cb-af24-c8415837c19b" (UID: "2ed9193c-0c46-47cb-af24-c8415837c19b"). InnerVolumeSpecName "kube-api-access-pdc7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.171498 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ed9193c-0c46-47cb-af24-c8415837c19b" (UID: "2ed9193c-0c46-47cb-af24-c8415837c19b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.172669 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config" (OuterVolumeSpecName: "config") pod "2ed9193c-0c46-47cb-af24-c8415837c19b" (UID: "2ed9193c-0c46-47cb-af24-c8415837c19b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.178990 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ed9193c-0c46-47cb-af24-c8415837c19b" (UID: "2ed9193c-0c46-47cb-af24-c8415837c19b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.210086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ed9193c-0c46-47cb-af24-c8415837c19b" (UID: "2ed9193c-0c46-47cb-af24-c8415837c19b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.222592 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdc7k\" (UniqueName: \"kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.222627 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.222637 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.222647 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.222656 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.723844 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:12 crc 
kubenswrapper[4885]: I0308 21:07:12.724325 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" containerID="cri-o://9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.724461 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" containerID="cri-o://0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.744548 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.744781 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" containerID="cri-o://6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.755791 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.756060 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-log" containerID="cri-o://d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.756134 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" 
containerID="cri-o://2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.768988 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.769193 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.806402 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" event={"ID":"2ed9193c-0c46-47cb-af24-c8415837c19b","Type":"ContainerDied","Data":"13a5f96703742b15ab41ce5ca4bff51ff0ff5f629fdccca2879c9831c1547b90"} Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.806516 4885 scope.go:117] "RemoveContainer" containerID="750d78661c34c956acafb525ed97e2c26316d92d4d7f7130df1013f1f9bc8ad8" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.806625 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.814125 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.814312 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="1dff0b58-ac0f-4d39-9910-f924fff8f816" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.827395 4885 scope.go:117] "RemoveContainer" containerID="44db3e80d53ecaa3d76c25ae2231f68ed9a8e2480156df67bfa8787c436f51c1" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.885453 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.894294 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.386005 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" path="/var/lib/kubelet/pods/2ed9193c-0c46-47cb-af24-c8415837c19b/volumes" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.401401 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.488630 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.545819 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.551675 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.552968 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.553011 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.644343 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dsgd\" (UniqueName: \"kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd\") pod \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " Mar 08 
21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.644439 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data\") pod \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.644581 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle\") pod \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.666489 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd" (OuterVolumeSpecName: "kube-api-access-2dsgd") pod "7213b9f1-1c28-4e32-b68b-8f7464f38de0" (UID: "7213b9f1-1c28-4e32-b68b-8f7464f38de0"). InnerVolumeSpecName "kube-api-access-2dsgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.668534 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7213b9f1-1c28-4e32-b68b-8f7464f38de0" (UID: "7213b9f1-1c28-4e32-b68b-8f7464f38de0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.668566 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data" (OuterVolumeSpecName: "config-data") pod "7213b9f1-1c28-4e32-b68b-8f7464f38de0" (UID: "7213b9f1-1c28-4e32-b68b-8f7464f38de0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.746328 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.748298 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.748364 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dsgd\" (UniqueName: \"kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.817066 4885 generic.go:334] "Generic (PLEG): container finished" podID="994b00da-2d97-4508-8f36-b517afab98e1" containerID="9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b" exitCode=143 Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.817489 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerDied","Data":"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b"} Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.820776 4885 generic.go:334] "Generic (PLEG): container finished" podID="3195111b-b266-425b-82da-98f3d0a29f0e" containerID="d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324" exitCode=143 Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.820847 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerDied","Data":"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324"} Mar 08 21:07:13 crc 
kubenswrapper[4885]: I0308 21:07:13.822609 4885 generic.go:334] "Generic (PLEG): container finished" podID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" containerID="2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59" exitCode=0 Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.822633 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7213b9f1-1c28-4e32-b68b-8f7464f38de0","Type":"ContainerDied","Data":"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59"} Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.822649 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7213b9f1-1c28-4e32-b68b-8f7464f38de0","Type":"ContainerDied","Data":"6877cb37b34c366b3176b008053204c71bb5733fa92d9537850aea7c85b6ca99"} Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.822665 4885 scope.go:117] "RemoveContainer" containerID="2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.822745 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.856862 4885 scope.go:117] "RemoveContainer" containerID="2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59" Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.857284 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59\": container with ID starting with 2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59 not found: ID does not exist" containerID="2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.857310 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59"} err="failed to get container status \"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59\": rpc error: code = NotFound desc = could not find container \"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59\": container with ID starting with 2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59 not found: ID does not exist" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.877047 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.886218 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.899642 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.900043 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" containerName="nova-cell1-novncproxy-novncproxy" 
Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900059 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.900083 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="init" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900089 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="init" Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.900100 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="dnsmasq-dns" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900132 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="dnsmasq-dns" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900301 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900311 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="dnsmasq-dns" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900881 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.903079 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.918848 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.952448 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2kr4\" (UniqueName: \"kubernetes.io/projected/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-kube-api-access-t2kr4\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.952507 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.952544 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.054703 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 
21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.054886 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2kr4\" (UniqueName: \"kubernetes.io/projected/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-kube-api-access-t2kr4\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.054953 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.058838 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.074007 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.082479 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2kr4\" (UniqueName: \"kubernetes.io/projected/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-kube-api-access-t2kr4\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.226478 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.697227 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.780200 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bp6b\" (UniqueName: \"kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b\") pod \"1dff0b58-ac0f-4d39-9910-f924fff8f816\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.780262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data\") pod \"1dff0b58-ac0f-4d39-9910-f924fff8f816\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.780399 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle\") pod \"1dff0b58-ac0f-4d39-9910-f924fff8f816\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.787003 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b" (OuterVolumeSpecName: "kube-api-access-8bp6b") pod "1dff0b58-ac0f-4d39-9910-f924fff8f816" (UID: "1dff0b58-ac0f-4d39-9910-f924fff8f816"). InnerVolumeSpecName "kube-api-access-8bp6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.808398 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data" (OuterVolumeSpecName: "config-data") pod "1dff0b58-ac0f-4d39-9910-f924fff8f816" (UID: "1dff0b58-ac0f-4d39-9910-f924fff8f816"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.812419 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dff0b58-ac0f-4d39-9910-f924fff8f816" (UID: "1dff0b58-ac0f-4d39-9910-f924fff8f816"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.851672 4885 generic.go:334] "Generic (PLEG): container finished" podID="1dff0b58-ac0f-4d39-9910-f924fff8f816" containerID="8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647" exitCode=0 Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.851777 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1dff0b58-ac0f-4d39-9910-f924fff8f816","Type":"ContainerDied","Data":"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647"} Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.851807 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1dff0b58-ac0f-4d39-9910-f924fff8f816","Type":"ContainerDied","Data":"31d1be971bb445599e9b2b87dbc985813f03ee86921b07147d64305072f4cfbe"} Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.851845 4885 scope.go:117] "RemoveContainer" containerID="8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647" Mar 08 21:07:14 crc 
kubenswrapper[4885]: I0308 21:07:14.852104 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.864785 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.889228 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.889285 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bp6b\" (UniqueName: \"kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.889311 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.890535 4885 scope.go:117] "RemoveContainer" containerID="8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647" Mar 08 21:07:14 crc kubenswrapper[4885]: E0308 21:07:14.891146 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647\": container with ID starting with 8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647 not found: ID does not exist" containerID="8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.891300 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647"} err="failed to get container status \"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647\": rpc error: code = NotFound desc = could not find container \"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647\": container with ID starting with 8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647 not found: ID does not exist" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.898204 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.908287 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.917713 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:14 crc kubenswrapper[4885]: E0308 21:07:14.918399 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dff0b58-ac0f-4d39-9910-f924fff8f816" containerName="nova-cell0-conductor-conductor" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.918592 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dff0b58-ac0f-4d39-9910-f924fff8f816" containerName="nova-cell0-conductor-conductor" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.919023 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dff0b58-ac0f-4d39-9910-f924fff8f816" containerName="nova-cell0-conductor-conductor" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.921898 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.924833 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.950415 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.997427 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv9z6\" (UniqueName: \"kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.997518 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.997581 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.099185 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv9z6\" (UniqueName: \"kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 
21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.099280 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.099342 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.105246 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.105400 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.117129 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv9z6\" (UniqueName: \"kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.247871 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.385460 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dff0b58-ac0f-4d39-9910-f924fff8f816" path="/var/lib/kubelet/pods/1dff0b58-ac0f-4d39-9910-f924fff8f816/volumes" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.386579 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" path="/var/lib/kubelet/pods/7213b9f1-1c28-4e32-b68b-8f7464f38de0/volumes" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.775626 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.870910 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b55b7c8a-8888-43e1-a593-d3a1f00cba4c","Type":"ContainerStarted","Data":"922183f45030e12d00e8b807707859ba7744efe0c6e294484c02f0e7e3f3c408"} Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.871000 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b55b7c8a-8888-43e1-a593-d3a1f00cba4c","Type":"ContainerStarted","Data":"65befdca19bb181cc8ab8864bc03f2df7d05201e6bf238aa9c4fd5e61334e69c"} Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.874044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b042d37f-f908-40b8-88be-21798a9428f6","Type":"ContainerStarted","Data":"80d76197d607834e3f09a7896979af3b1a5308464484d654754ae837dd80a0ab"} Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.905152 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.905103398 podStartE2EDuration="2.905103398s" podCreationTimestamp="2026-03-08 21:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:15.885175556 +0000 UTC m=+5737.281229629" watchObservedRunningTime="2026-03-08 21:07:15.905103398 +0000 UTC m=+5737.301157451" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.910814 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.116:8775/\": read tcp 10.217.0.2:59796->10.217.1.116:8775: read: connection reset by peer" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.911144 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.116:8775/\": read tcp 10.217.0.2:59798->10.217.1.116:8775: read: connection reset by peer" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.933899 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.117:8774/\": read tcp 10.217.0.2:45742->10.217.1.117:8774: read: connection reset by peer" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.934387 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.117:8774/\": read tcp 10.217.0.2:45744->10.217.1.117:8774: read: connection reset by peer" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.010518 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.010709 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-cell1-conductor-0" podUID="69024278-2c5f-4862-ac44-e04663a0c4a5" containerName="nova-cell1-conductor-conductor" containerID="cri-o://b24462edde9f60cfa7555c270a546d933c909d490fc62394b9ed6e4a826084f2" gracePeriod=30 Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.313372 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.378513 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427233 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpccf\" (UniqueName: \"kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf\") pod \"3195111b-b266-425b-82da-98f3d0a29f0e\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427426 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data\") pod \"3195111b-b266-425b-82da-98f3d0a29f0e\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427487 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle\") pod \"3195111b-b266-425b-82da-98f3d0a29f0e\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427521 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data\") pod \"994b00da-2d97-4508-8f36-b517afab98e1\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " 
Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427543 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle\") pod \"994b00da-2d97-4508-8f36-b517afab98e1\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427570 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74c4d\" (UniqueName: \"kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d\") pod \"994b00da-2d97-4508-8f36-b517afab98e1\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427591 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs\") pod \"3195111b-b266-425b-82da-98f3d0a29f0e\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427616 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs\") pod \"994b00da-2d97-4508-8f36-b517afab98e1\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.428656 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs" (OuterVolumeSpecName: "logs") pod "994b00da-2d97-4508-8f36-b517afab98e1" (UID: "994b00da-2d97-4508-8f36-b517afab98e1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.431237 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs" (OuterVolumeSpecName: "logs") pod "3195111b-b266-425b-82da-98f3d0a29f0e" (UID: "3195111b-b266-425b-82da-98f3d0a29f0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.442705 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d" (OuterVolumeSpecName: "kube-api-access-74c4d") pod "994b00da-2d97-4508-8f36-b517afab98e1" (UID: "994b00da-2d97-4508-8f36-b517afab98e1"). InnerVolumeSpecName "kube-api-access-74c4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.448973 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf" (OuterVolumeSpecName: "kube-api-access-vpccf") pod "3195111b-b266-425b-82da-98f3d0a29f0e" (UID: "3195111b-b266-425b-82da-98f3d0a29f0e"). InnerVolumeSpecName "kube-api-access-vpccf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.472331 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3195111b-b266-425b-82da-98f3d0a29f0e" (UID: "3195111b-b266-425b-82da-98f3d0a29f0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.480799 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data" (OuterVolumeSpecName: "config-data") pod "994b00da-2d97-4508-8f36-b517afab98e1" (UID: "994b00da-2d97-4508-8f36-b517afab98e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.482530 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "994b00da-2d97-4508-8f36-b517afab98e1" (UID: "994b00da-2d97-4508-8f36-b517afab98e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.485176 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data" (OuterVolumeSpecName: "config-data") pod "3195111b-b266-425b-82da-98f3d0a29f0e" (UID: "3195111b-b266-425b-82da-98f3d0a29f0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530129 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530157 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530168 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74c4d\" (UniqueName: \"kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530180 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530188 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530200 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpccf\" (UniqueName: \"kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530208 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530231 4885 reconciler_common.go:293] 
"Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.883383 4885 generic.go:334] "Generic (PLEG): container finished" podID="994b00da-2d97-4508-8f36-b517afab98e1" containerID="0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49" exitCode=0 Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.883468 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerDied","Data":"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49"} Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.883501 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerDied","Data":"ed98b0990237ad316a273718be6c6f8f3198828e148541a9840c7e6321b7e7da"} Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.883472 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.883520 4885 scope.go:117] "RemoveContainer" containerID="0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.886502 4885 generic.go:334] "Generic (PLEG): container finished" podID="3195111b-b266-425b-82da-98f3d0a29f0e" containerID="2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895" exitCode=0 Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.886586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerDied","Data":"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895"} Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.886621 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerDied","Data":"77ebc86a187e7f428857986b384dc697b96b7685acfe2d360f9674aa240afe23"} Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.886693 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.915248 4885 scope.go:117] "RemoveContainer" containerID="9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.921189 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b042d37f-f908-40b8-88be-21798a9428f6","Type":"ContainerStarted","Data":"3eb13c978d994d153edf32828a7f7fe52fe2b797781fd84bb013541ebf4584bf"} Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.921248 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.946220 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.966984 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.979044 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.980297 4885 scope.go:117] "RemoveContainer" containerID="0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49" Mar 08 21:07:16 crc kubenswrapper[4885]: E0308 21:07:16.986261 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49\": container with ID starting with 0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49 not found: ID does not exist" containerID="0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.986310 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49"} err="failed to get container status \"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49\": rpc error: code = NotFound desc = could not find container \"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49\": container with ID starting with 0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49 not found: ID does not exist" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.986333 4885 scope.go:117] "RemoveContainer" containerID="9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.994751 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:16 crc kubenswrapper[4885]: E0308 21:07:16.995050 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b\": container with ID starting with 9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b not found: ID does not exist" containerID="9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.995094 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b"} err="failed to get container status \"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b\": rpc error: code = NotFound desc = could not find container \"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b\": container with ID starting with 9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b not found: ID does not exist" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.995116 4885 scope.go:117] "RemoveContainer" 
containerID="2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.021159 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.021143956 podStartE2EDuration="3.021143956s" podCreationTimestamp="2026-03-08 21:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:16.972272222 +0000 UTC m=+5738.368326245" watchObservedRunningTime="2026-03-08 21:07:17.021143956 +0000 UTC m=+5738.417197979" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026249 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:17 crc kubenswrapper[4885]: E0308 21:07:17.026631 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026642 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" Mar 08 21:07:17 crc kubenswrapper[4885]: E0308 21:07:17.026657 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-log" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026664 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-log" Mar 08 21:07:17 crc kubenswrapper[4885]: E0308 21:07:17.026674 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026680 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" Mar 08 21:07:17 crc 
kubenswrapper[4885]: E0308 21:07:17.026691 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026697 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026861 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026880 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-log" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026892 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026899 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.052946 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.056377 4885 scope.go:117] "RemoveContainer" containerID="d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.067990 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.091212 4885 scope.go:117] "RemoveContainer" containerID="2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895" Mar 08 21:07:17 crc kubenswrapper[4885]: E0308 21:07:17.093427 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895\": container with ID starting with 2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895 not found: ID does not exist" containerID="2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.093553 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895"} err="failed to get container status \"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895\": rpc error: code = NotFound desc = could not find container \"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895\": container with ID starting with 2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895 not found: ID does not exist" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.093628 4885 scope.go:117] "RemoveContainer" containerID="d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324" Mar 08 21:07:17 crc kubenswrapper[4885]: E0308 21:07:17.098451 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324\": container with ID starting with d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324 not found: ID does not exist" containerID="d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.098478 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324"} err="failed to get container status \"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324\": rpc error: code = NotFound desc = could not find container \"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324\": container with ID starting with d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324 not found: ID does not exist" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.109689 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.119017 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.120549 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.122453 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.126851 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.154075 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.154147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.154275 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.154350 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56dvv\" (UniqueName: \"kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256121 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-56dvv\" (UniqueName: \"kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256412 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256438 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256506 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256529 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs\") pod 
\"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256581 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256610 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttvw5\" (UniqueName: \"kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.257053 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.261434 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.265774 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.294632 4885 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-56dvv\" (UniqueName: \"kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.358194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.358353 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.358428 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.358615 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttvw5\" (UniqueName: \"kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.359387 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc 
kubenswrapper[4885]: I0308 21:07:17.361247 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.365443 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.379123 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttvw5\" (UniqueName: \"kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.381238 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" path="/var/lib/kubelet/pods/3195111b-b266-425b-82da-98f3d0a29f0e/volumes" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.381981 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="994b00da-2d97-4508-8f36-b517afab98e1" path="/var/lib/kubelet/pods/994b00da-2d97-4508-8f36-b517afab98e1/volumes" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.394605 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.435705 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.929764 4885 generic.go:334] "Generic (PLEG): container finished" podID="69024278-2c5f-4862-ac44-e04663a0c4a5" containerID="b24462edde9f60cfa7555c270a546d933c909d490fc62394b9ed6e4a826084f2" exitCode=0 Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.929993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69024278-2c5f-4862-ac44-e04663a0c4a5","Type":"ContainerDied","Data":"b24462edde9f60cfa7555c270a546d933c909d490fc62394b9ed6e4a826084f2"} Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.930199 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69024278-2c5f-4862-ac44-e04663a0c4a5","Type":"ContainerDied","Data":"38f1ceae301c390ad8c08deda2ca09e48e527fb1b139f821cdcf653ea04147c0"} Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.930224 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38f1ceae301c390ad8c08deda2ca09e48e527fb1b139f821cdcf653ea04147c0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.941360 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.012499 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.024442 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.201779 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7jsn\" (UniqueName: \"kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn\") pod \"69024278-2c5f-4862-ac44-e04663a0c4a5\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.202484 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data\") pod \"69024278-2c5f-4862-ac44-e04663a0c4a5\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.202648 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle\") pod \"69024278-2c5f-4862-ac44-e04663a0c4a5\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.205964 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn" (OuterVolumeSpecName: "kube-api-access-d7jsn") pod "69024278-2c5f-4862-ac44-e04663a0c4a5" (UID: "69024278-2c5f-4862-ac44-e04663a0c4a5"). InnerVolumeSpecName "kube-api-access-d7jsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.233162 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69024278-2c5f-4862-ac44-e04663a0c4a5" (UID: "69024278-2c5f-4862-ac44-e04663a0c4a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.256980 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data" (OuterVolumeSpecName: "config-data") pod "69024278-2c5f-4862-ac44-e04663a0c4a5" (UID: "69024278-2c5f-4862-ac44-e04663a0c4a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.304687 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7jsn\" (UniqueName: \"kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.304722 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.304732 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:18 crc kubenswrapper[4885]: E0308 21:07:18.542837 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:18 crc kubenswrapper[4885]: E0308 21:07:18.548904 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:18 crc kubenswrapper[4885]: E0308 21:07:18.549906 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:18 crc kubenswrapper[4885]: E0308 21:07:18.549989 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.951190 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerStarted","Data":"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.951279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerStarted","Data":"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.951320 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerStarted","Data":"e11dd41c5f061aeb15ad1b8571823004739345378aecc51a3efc7e9384ef84f7"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.970525 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerStarted","Data":"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.970598 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerStarted","Data":"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.970620 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerStarted","Data":"d6b2799aab2f7b019fc1cf13b4754da3de421e26bd6b42741c010f54fac4b62f"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.971835 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.001692 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.001668103 podStartE2EDuration="3.001668103s" podCreationTimestamp="2026-03-08 21:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:18.992784976 +0000 UTC m=+5740.388839019" watchObservedRunningTime="2026-03-08 21:07:19.001668103 +0000 UTC m=+5740.397722136" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.018137 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.018107962 podStartE2EDuration="3.018107962s" podCreationTimestamp="2026-03-08 21:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:19.0101548 +0000 UTC m=+5740.406208843" watchObservedRunningTime="2026-03-08 
21:07:19.018107962 +0000 UTC m=+5740.414161995" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.049090 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.067939 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.079372 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:19 crc kubenswrapper[4885]: E0308 21:07:19.079973 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69024278-2c5f-4862-ac44-e04663a0c4a5" containerName="nova-cell1-conductor-conductor" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.080004 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="69024278-2c5f-4862-ac44-e04663a0c4a5" containerName="nova-cell1-conductor-conductor" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.080280 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="69024278-2c5f-4862-ac44-e04663a0c4a5" containerName="nova-cell1-conductor-conductor" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.081191 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.087014 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.106010 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.117908 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.118150 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vfz6\" (UniqueName: \"kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.118279 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.219447 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc 
kubenswrapper[4885]: I0308 21:07:19.219599 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vfz6\" (UniqueName: \"kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.219634 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.224754 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.227417 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.231121 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.240893 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vfz6\" (UniqueName: \"kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6\") pod \"nova-cell1-conductor-0\" (UID: 
\"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.380981 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69024278-2c5f-4862-ac44-e04663a0c4a5" path="/var/lib/kubelet/pods/69024278-2c5f-4862-ac44-e04663a0c4a5/volumes" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.420738 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: W0308 21:07:19.919383 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d9d37c_0204_47e9_956d_d93f2dd1e94d.slice/crio-4989a2ecee24254fe4aca7e76f1e2e40b733be1e70cbeec7d60131da54ab2fd4 WatchSource:0}: Error finding container 4989a2ecee24254fe4aca7e76f1e2e40b733be1e70cbeec7d60131da54ab2fd4: Status 404 returned error can't find the container with id 4989a2ecee24254fe4aca7e76f1e2e40b733be1e70cbeec7d60131da54ab2fd4 Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.924152 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.992012 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"95d9d37c-0204-47e9-956d-d93f2dd1e94d","Type":"ContainerStarted","Data":"4989a2ecee24254fe4aca7e76f1e2e40b733be1e70cbeec7d60131da54ab2fd4"} Mar 08 21:07:21 crc kubenswrapper[4885]: I0308 21:07:21.004165 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"95d9d37c-0204-47e9-956d-d93f2dd1e94d","Type":"ContainerStarted","Data":"a2acd6bf101df1fb3fec18f77f85983c836074e8e9c6420f407dfa0fe15f85c1"} Mar 08 21:07:21 crc kubenswrapper[4885]: I0308 21:07:21.005572 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" 
Mar 08 21:07:21 crc kubenswrapper[4885]: I0308 21:07:21.046444 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.046426583 podStartE2EDuration="2.046426583s" podCreationTimestamp="2026-03-08 21:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:21.034127365 +0000 UTC m=+5742.430181388" watchObservedRunningTime="2026-03-08 21:07:21.046426583 +0000 UTC m=+5742.442480606" Mar 08 21:07:22 crc kubenswrapper[4885]: E0308 21:07:22.381712 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a29a091_3ebc_4dbb_b876_19892bedba02.slice/crio-conmon-6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd.scope\": RecentStats: unable to find data in memory cache]" Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.396456 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.396890 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.829449 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.993187 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data\") pod \"6a29a091-3ebc-4dbb-b876-19892bedba02\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.993665 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle\") pod \"6a29a091-3ebc-4dbb-b876-19892bedba02\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.993793 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sf6d\" (UniqueName: \"kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d\") pod \"6a29a091-3ebc-4dbb-b876-19892bedba02\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.009030 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d" (OuterVolumeSpecName: "kube-api-access-5sf6d") pod "6a29a091-3ebc-4dbb-b876-19892bedba02" (UID: "6a29a091-3ebc-4dbb-b876-19892bedba02"). InnerVolumeSpecName "kube-api-access-5sf6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.021890 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a29a091-3ebc-4dbb-b876-19892bedba02" (UID: "6a29a091-3ebc-4dbb-b876-19892bedba02"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.027202 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" exitCode=0 Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.027288 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.027368 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a29a091-3ebc-4dbb-b876-19892bedba02","Type":"ContainerDied","Data":"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd"} Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.027433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a29a091-3ebc-4dbb-b876-19892bedba02","Type":"ContainerDied","Data":"3c576770d28108955fb30624b271f23df81e0516b1d08085cc88a640253a2d1e"} Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.027491 4885 scope.go:117] "RemoveContainer" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.034319 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data" (OuterVolumeSpecName: "config-data") pod "6a29a091-3ebc-4dbb-b876-19892bedba02" (UID: "6a29a091-3ebc-4dbb-b876-19892bedba02"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.083184 4885 scope.go:117] "RemoveContainer" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" Mar 08 21:07:23 crc kubenswrapper[4885]: E0308 21:07:23.083665 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd\": container with ID starting with 6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd not found: ID does not exist" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.083702 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd"} err="failed to get container status \"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd\": rpc error: code = NotFound desc = could not find container \"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd\": container with ID starting with 6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd not found: ID does not exist" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.096251 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.096293 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sf6d\" (UniqueName: \"kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.096313 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.410304 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.437009 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.446341 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:23 crc kubenswrapper[4885]: E0308 21:07:23.446941 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.446965 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.447203 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.448515 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.451544 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.459099 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.610785 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.611000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.611083 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7p5\" (UniqueName: \"kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.713022 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.713405 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.713598 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7p5\" (UniqueName: \"kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.717551 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.725612 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.743267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7p5\" (UniqueName: \"kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.764191 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:07:24 crc kubenswrapper[4885]: I0308 21:07:24.227114 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:24 crc kubenswrapper[4885]: I0308 21:07:24.242700 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:24 crc kubenswrapper[4885]: W0308 21:07:24.331110 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5c41752_6a6f_4bbf_882f_a1e873cd225f.slice/crio-ef5f617fb77787529ec827122c4f886ba5c0cc15c252f2142be0c62ea54a65b1 WatchSource:0}: Error finding container ef5f617fb77787529ec827122c4f886ba5c0cc15c252f2142be0c62ea54a65b1: Status 404 returned error can't find the container with id ef5f617fb77787529ec827122c4f886ba5c0cc15c252f2142be0c62ea54a65b1 Mar 08 21:07:24 crc kubenswrapper[4885]: I0308 21:07:24.333646 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.062240 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e5c41752-6a6f-4bbf-882f-a1e873cd225f","Type":"ContainerStarted","Data":"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403"} Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.062749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e5c41752-6a6f-4bbf-882f-a1e873cd225f","Type":"ContainerStarted","Data":"ef5f617fb77787529ec827122c4f886ba5c0cc15c252f2142be0c62ea54a65b1"} Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.077816 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.096519 4885 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.096503522 podStartE2EDuration="2.096503522s" podCreationTimestamp="2026-03-08 21:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:25.085754815 +0000 UTC m=+5746.481808838" watchObservedRunningTime="2026-03-08 21:07:25.096503522 +0000 UTC m=+5746.492557545" Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.284431 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.376743 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" path="/var/lib/kubelet/pods/6a29a091-3ebc-4dbb-b876-19892bedba02/volumes" Mar 08 21:07:27 crc kubenswrapper[4885]: I0308 21:07:27.395690 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:07:27 crc kubenswrapper[4885]: I0308 21:07:27.396008 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:07:27 crc kubenswrapper[4885]: I0308 21:07:27.437202 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:07:27 crc kubenswrapper[4885]: I0308 21:07:27.437255 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:07:28 crc kubenswrapper[4885]: I0308 21:07:28.436226 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.127:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:07:28 crc kubenswrapper[4885]: I0308 21:07:28.561187 4885 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.128:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:07:28 crc kubenswrapper[4885]: I0308 21:07:28.561302 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.127:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:07:28 crc kubenswrapper[4885]: I0308 21:07:28.561392 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.128:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:07:28 crc kubenswrapper[4885]: I0308 21:07:28.764611 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 21:07:29 crc kubenswrapper[4885]: I0308 21:07:29.470938 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.044249 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.046963 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.050018 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.075322 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.160524 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9txnv\" (UniqueName: \"kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.161157 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.161276 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.161470 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 
21:07:31.161600 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.161944 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.264735 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9txnv\" (UniqueName: \"kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.265139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.265352 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.265528 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.265713 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.265900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.266489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.273368 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.274373 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " 
pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.275223 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.276041 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.293315 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9txnv\" (UniqueName: \"kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.378547 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.864675 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:31 crc kubenswrapper[4885]: W0308 21:07:31.903148 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13978f90_1bb0_4f18_9094_6bbbafc7dd21.slice/crio-0d1992fe6e0f61053dcf0864790c4f6f70e8a935b8f3ec31679a88218c2e2731 WatchSource:0}: Error finding container 0d1992fe6e0f61053dcf0864790c4f6f70e8a935b8f3ec31679a88218c2e2731: Status 404 returned error can't find the container with id 0d1992fe6e0f61053dcf0864790c4f6f70e8a935b8f3ec31679a88218c2e2731 Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.140677 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerStarted","Data":"0d1992fe6e0f61053dcf0864790c4f6f70e8a935b8f3ec31679a88218c2e2731"} Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.300645 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.300912 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api-log" containerID="cri-o://ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c" gracePeriod=30 Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.300996 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api" containerID="cri-o://8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83" gracePeriod=30 Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.627407 4885 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.629177 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.631592 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.674337 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.791869 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.791994 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792081 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792156 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792282 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-run\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792420 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bpjz\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-kube-api-access-6bpjz\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792455 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792485 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792525 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792574 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792618 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792639 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792658 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792691 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792834 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895545 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895597 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-sys\") pod 
\"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895632 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-run\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895720 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bpjz\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-kube-api-access-6bpjz\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895740 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc 
kubenswrapper[4885]: I0308 21:07:32.895764 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895789 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895814 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895868 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895890 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895914 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895979 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896265 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896371 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " 
pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896418 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896483 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896687 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896745 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-run\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896773 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896904 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.897332 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.900104 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.901054 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.901200 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.902596 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-ceph\") pod \"cinder-volume-volume1-0\" (UID: 
\"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.910608 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.914917 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bpjz\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-kube-api-access-6bpjz\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.988045 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.153946 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerStarted","Data":"9881abb2fc54dccce28e0bf4a5673dd83ed488ea4488884d6a3d9befedf24e2b"} Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.153998 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerStarted","Data":"1fe3223e4f4132ee0aa233bdece638ddefaee60de254e8e01f53c8649e9e80d2"} Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.176807 4885 generic.go:334] "Generic (PLEG): container finished" podID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerID="ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c" exitCode=143 Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.177667 4885 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerDied","Data":"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c"} Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.187906 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.187883926 podStartE2EDuration="2.187883926s" podCreationTimestamp="2026-03-08 21:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:33.183576341 +0000 UTC m=+5754.579630354" watchObservedRunningTime="2026-03-08 21:07:33.187883926 +0000 UTC m=+5754.583937959" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.255674 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.275485 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.275606 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.280856 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.383486 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405558 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405597 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405620 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-run\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405636 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405670 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405687 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405710 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-ceph\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405741 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-scripts\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405833 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405867 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-dev\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405896 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d65tg\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-kube-api-access-d65tg\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405953 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-sys\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405980 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.406011 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-lib-modules\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.406060 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data\") pod \"cinder-backup-0\" (UID: 
\"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.406093 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.507884 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-dev\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.507943 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d65tg\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-kube-api-access-d65tg\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.507967 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-sys\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.507985 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508014 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-lib-modules\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508040 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508069 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508097 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508120 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508137 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: 
\"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508155 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-run\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508190 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508205 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508225 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-ceph\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508254 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-scripts\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508271 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" 
(UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508344 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508377 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-dev\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508631 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-sys\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-lib-modules\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.509365 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-run\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.509400 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.510057 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.510720 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.512047 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.515358 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-ceph\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" 
Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.515461 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-scripts\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.518734 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.519364 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.520018 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.524221 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d65tg\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-kube-api-access-d65tg\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.642597 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.765006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.802217 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 21:07:34 crc kubenswrapper[4885]: I0308 21:07:34.201032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"954ec951-d955-4335-93bb-d43e59408ae3","Type":"ContainerStarted","Data":"e6925dbb4d4c2d2645ba7fe3b89359e6a91429076db5f03f49cb0e1ef3e714bb"} Mar 08 21:07:34 crc kubenswrapper[4885]: I0308 21:07:34.260287 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 21:07:34 crc kubenswrapper[4885]: I0308 21:07:34.340271 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 08 21:07:34 crc kubenswrapper[4885]: W0308 21:07:34.458064 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eb198a5_6241_48b0_bc8c_57ad764a1f3b.slice/crio-73d922692a4f325773d69803802351a5f0faa13123ce224b57b34937a18ac70a WatchSource:0}: Error finding container 73d922692a4f325773d69803802351a5f0faa13123ce224b57b34937a18ac70a: Status 404 returned error can't find the container with id 73d922692a4f325773d69803802351a5f0faa13123ce224b57b34937a18ac70a Mar 08 21:07:35 crc kubenswrapper[4885]: I0308 21:07:35.213766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"954ec951-d955-4335-93bb-d43e59408ae3","Type":"ContainerStarted","Data":"f65dd3c1291428b7f45a33e87497faa12831a7ebb83ceff464a7e0aa5ae65bb1"} Mar 08 21:07:35 crc kubenswrapper[4885]: I0308 21:07:35.215604 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-volume-volume1-0" event={"ID":"954ec951-d955-4335-93bb-d43e59408ae3","Type":"ContainerStarted","Data":"19d24ae5f67c03e1da8f4df4964cafb82b3621b9126e14acec3d80f16e854646"} Mar 08 21:07:35 crc kubenswrapper[4885]: I0308 21:07:35.219556 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5eb198a5-6241-48b0-bc8c-57ad764a1f3b","Type":"ContainerStarted","Data":"73d922692a4f325773d69803802351a5f0faa13123ce224b57b34937a18ac70a"} Mar 08 21:07:35 crc kubenswrapper[4885]: I0308 21:07:35.260345 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.132550043 podStartE2EDuration="3.260320744s" podCreationTimestamp="2026-03-08 21:07:32 +0000 UTC" firstStartedPulling="2026-03-08 21:07:33.377973376 +0000 UTC m=+5754.774027399" lastFinishedPulling="2026-03-08 21:07:34.505744047 +0000 UTC m=+5755.901798100" observedRunningTime="2026-03-08 21:07:35.250433201 +0000 UTC m=+5756.646487224" watchObservedRunningTime="2026-03-08 21:07:35.260320744 +0000 UTC m=+5756.656374767" Mar 08 21:07:35 crc kubenswrapper[4885]: I0308 21:07:35.934234 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063202 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063535 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063562 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063620 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063652 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rftcn\" (UniqueName: \"kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063733 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063784 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.064289 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.065019 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs" (OuterVolumeSpecName: "logs") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.065714 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.069401 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts" (OuterVolumeSpecName: "scripts") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.069502 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.071104 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn" (OuterVolumeSpecName: "kube-api-access-rftcn") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "kube-api-access-rftcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.100079 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.111227 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data" (OuterVolumeSpecName: "config-data") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168480 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168521 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168534 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168547 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rftcn\" (UniqueName: \"kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168562 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168574 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.231514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5eb198a5-6241-48b0-bc8c-57ad764a1f3b","Type":"ContainerStarted","Data":"5602fc306631a848e72138dff41bdfb61a61f06959859effd5f35781d85b851a"} Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 
21:07:36.231561 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5eb198a5-6241-48b0-bc8c-57ad764a1f3b","Type":"ContainerStarted","Data":"d68b540453dd58e109792497aa02a9665490650e6b7a8937a0a76f89af9ce4fd"} Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.233694 4885 generic.go:334] "Generic (PLEG): container finished" podID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerID="8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83" exitCode=0 Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.234452 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.235671 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerDied","Data":"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83"} Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.235712 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerDied","Data":"73a221bfa90434743d951101b28f8fc6f753d32cb93f954ef81be7dadb45dff1"} Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.235728 4885 scope.go:117] "RemoveContainer" containerID="8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.276902 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.586009922 podStartE2EDuration="3.276885239s" podCreationTimestamp="2026-03-08 21:07:33 +0000 UTC" firstStartedPulling="2026-03-08 21:07:34.460991874 +0000 UTC m=+5755.857045907" lastFinishedPulling="2026-03-08 21:07:35.151867191 +0000 UTC m=+5756.547921224" observedRunningTime="2026-03-08 21:07:36.25813657 +0000 UTC m=+5757.654190623" 
watchObservedRunningTime="2026-03-08 21:07:36.276885239 +0000 UTC m=+5757.672939252" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.281025 4885 scope.go:117] "RemoveContainer" containerID="ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.291008 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.299403 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.315475 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:36 crc kubenswrapper[4885]: E0308 21:07:36.316667 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.316686 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api" Mar 08 21:07:36 crc kubenswrapper[4885]: E0308 21:07:36.316711 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api-log" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.316720 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api-log" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.317271 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.317295 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api-log" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.319887 4885 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.322304 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.335757 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.350002 4885 scope.go:117] "RemoveContainer" containerID="8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83" Mar 08 21:07:36 crc kubenswrapper[4885]: E0308 21:07:36.350716 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83\": container with ID starting with 8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83 not found: ID does not exist" containerID="8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.350753 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83"} err="failed to get container status \"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83\": rpc error: code = NotFound desc = could not find container \"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83\": container with ID starting with 8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83 not found: ID does not exist" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.350780 4885 scope.go:117] "RemoveContainer" containerID="ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c" Mar 08 21:07:36 crc kubenswrapper[4885]: E0308 21:07:36.351847 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c\": container with ID starting with ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c not found: ID does not exist" containerID="ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.351869 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c"} err="failed to get container status \"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c\": rpc error: code = NotFound desc = could not find container \"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c\": container with ID starting with ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c not found: ID does not exist" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.378624 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481144 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/216289ea-1f99-4924-aa6b-9951b3b3840e-logs\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481254 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481291 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/216289ea-1f99-4924-aa6b-9951b3b3840e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481317 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data-custom\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481343 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-scripts\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481501 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97ts9\" (UniqueName: \"kubernetes.io/projected/216289ea-1f99-4924-aa6b-9951b3b3840e-kube-api-access-97ts9\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.583006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data\") pod \"cinder-api-0\" (UID: 
\"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.583125 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/216289ea-1f99-4924-aa6b-9951b3b3840e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.583235 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/216289ea-1f99-4924-aa6b-9951b3b3840e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.583332 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data-custom\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.584053 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.584134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-scripts\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.584234 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-97ts9\" (UniqueName: \"kubernetes.io/projected/216289ea-1f99-4924-aa6b-9951b3b3840e-kube-api-access-97ts9\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.584379 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/216289ea-1f99-4924-aa6b-9951b3b3840e-logs\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.585291 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/216289ea-1f99-4924-aa6b-9951b3b3840e-logs\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.589245 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.589280 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.589957 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data-custom\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 
21:07:36.604855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97ts9\" (UniqueName: \"kubernetes.io/projected/216289ea-1f99-4924-aa6b-9951b3b3840e-kube-api-access-97ts9\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.610163 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-scripts\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.658254 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.170950 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.259879 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"216289ea-1f99-4924-aa6b-9951b3b3840e","Type":"ContainerStarted","Data":"26eadba5e089e69115d1c5ae218bbe8e990ab92af7d65ef8c3353b5cb9e20b6a"} Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.379991 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" path="/var/lib/kubelet/pods/a4a0f209-cabb-4b78-8e14-17625407e49d/volumes" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.397859 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.399963 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.402838 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.443751 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.444160 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.448206 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.448375 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.988910 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:38 crc kubenswrapper[4885]: I0308 21:07:38.272190 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"216289ea-1f99-4924-aa6b-9951b3b3840e","Type":"ContainerStarted","Data":"bd524370eea870813a7c3754ca1646805cc89fdc3e8f0f15f92f53813e1b0f05"} Mar 08 21:07:38 crc kubenswrapper[4885]: I0308 21:07:38.272636 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:07:38 crc kubenswrapper[4885]: I0308 21:07:38.277456 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 21:07:38 crc kubenswrapper[4885]: I0308 21:07:38.277602 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:07:38 crc kubenswrapper[4885]: I0308 21:07:38.643729 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 08 21:07:39 crc kubenswrapper[4885]: I0308 21:07:39.289457 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"216289ea-1f99-4924-aa6b-9951b3b3840e","Type":"ContainerStarted","Data":"de854a5b5ca38c81d341cd7433e27205131779dbdd929e60f852088e1e045149"} Mar 08 21:07:39 crc kubenswrapper[4885]: I0308 21:07:39.290107 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 08 21:07:39 crc kubenswrapper[4885]: I0308 21:07:39.327370 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.327338955 podStartE2EDuration="3.327338955s" podCreationTimestamp="2026-03-08 21:07:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:39.319455705 +0000 UTC m=+5760.715509738" watchObservedRunningTime="2026-03-08 21:07:39.327338955 +0000 UTC m=+5760.723393008" Mar 08 21:07:41 crc kubenswrapper[4885]: I0308 21:07:41.663275 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 08 21:07:41 crc kubenswrapper[4885]: I0308 21:07:41.766798 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:42 crc kubenswrapper[4885]: I0308 21:07:42.349003 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="cinder-scheduler" containerID="cri-o://1fe3223e4f4132ee0aa233bdece638ddefaee60de254e8e01f53c8649e9e80d2" gracePeriod=30 Mar 08 21:07:42 crc kubenswrapper[4885]: I0308 21:07:42.349088 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="probe" containerID="cri-o://9881abb2fc54dccce28e0bf4a5673dd83ed488ea4488884d6a3d9befedf24e2b" gracePeriod=30 Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.167722 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/cinder-volume-volume1-0"
Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.362545 4885 generic.go:334] "Generic (PLEG): container finished" podID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerID="9881abb2fc54dccce28e0bf4a5673dd83ed488ea4488884d6a3d9befedf24e2b" exitCode=0
Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.362595 4885 generic.go:334] "Generic (PLEG): container finished" podID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerID="1fe3223e4f4132ee0aa233bdece638ddefaee60de254e8e01f53c8649e9e80d2" exitCode=0
Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.362623 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerDied","Data":"9881abb2fc54dccce28e0bf4a5673dd83ed488ea4488884d6a3d9befedf24e2b"}
Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.362661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerDied","Data":"1fe3223e4f4132ee0aa233bdece638ddefaee60de254e8e01f53c8649e9e80d2"}
Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.906091 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.916700 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.056950 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") "
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057022 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") "
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057120 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") "
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057149 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9txnv\" (UniqueName: \"kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") "
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057201 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") "
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057280 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") "
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057584 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057975 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.063827 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts" (OuterVolumeSpecName: "scripts") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.067616 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.075359 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv" (OuterVolumeSpecName: "kube-api-access-9txnv") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "kube-api-access-9txnv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.120474 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.159477 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9txnv\" (UniqueName: \"kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv\") on node \"crc\" DevicePath \"\""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.159797 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.159887 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.159993 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.165108 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data" (OuterVolumeSpecName: "config-data") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.261943 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.394835 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerDied","Data":"0d1992fe6e0f61053dcf0864790c4f6f70e8a935b8f3ec31679a88218c2e2731"}
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.394905 4885 scope.go:117] "RemoveContainer" containerID="9881abb2fc54dccce28e0bf4a5673dd83ed488ea4488884d6a3d9befedf24e2b"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.395136 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.441551 4885 scope.go:117] "RemoveContainer" containerID="1fe3223e4f4132ee0aa233bdece638ddefaee60de254e8e01f53c8649e9e80d2"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.449414 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.481104 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.501430 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 08 21:07:44 crc kubenswrapper[4885]: E0308 21:07:44.501955 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="probe"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.501978 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="probe"
Mar 08 21:07:44 crc kubenswrapper[4885]: E0308 21:07:44.501998 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="cinder-scheduler"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.502006 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="cinder-scheduler"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.502243 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="probe"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.502266 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="cinder-scheduler"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.507000 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.509816 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.533415 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.678572 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.678653 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.678686 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7620c463-ffe0-4d70-ba82-deaef34da248-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.678856 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-scripts\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.678909 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7zzw\" (UniqueName: \"kubernetes.io/projected/7620c463-ffe0-4d70-ba82-deaef34da248-kube-api-access-g7zzw\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.679122 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.780781 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.780880 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.780965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.780997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7620c463-ffe0-4d70-ba82-deaef34da248-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.781078 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-scripts\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.781113 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7zzw\" (UniqueName: \"kubernetes.io/projected/7620c463-ffe0-4d70-ba82-deaef34da248-kube-api-access-g7zzw\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.781781 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7620c463-ffe0-4d70-ba82-deaef34da248-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.785287 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.786228 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.793811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-scripts\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.794408 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.810729 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7zzw\" (UniqueName: \"kubernetes.io/projected/7620c463-ffe0-4d70-ba82-deaef34da248-kube-api-access-g7zzw\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0"
Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.829410 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 08 21:07:45 crc kubenswrapper[4885]: I0308 21:07:45.340696 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 08 21:07:45 crc kubenswrapper[4885]: W0308 21:07:45.349334 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7620c463_ffe0_4d70_ba82_deaef34da248.slice/crio-756037f3c75add192500cc5c007cc0c20b95aad2e7d6a133df86b5b8ee4a6efc WatchSource:0}: Error finding container 756037f3c75add192500cc5c007cc0c20b95aad2e7d6a133df86b5b8ee4a6efc: Status 404 returned error can't find the container with id 756037f3c75add192500cc5c007cc0c20b95aad2e7d6a133df86b5b8ee4a6efc
Mar 08 21:07:45 crc kubenswrapper[4885]: I0308 21:07:45.381134 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" path="/var/lib/kubelet/pods/13978f90-1bb0-4f18-9094-6bbbafc7dd21/volumes"
Mar 08 21:07:45 crc kubenswrapper[4885]: I0308 21:07:45.415304 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7620c463-ffe0-4d70-ba82-deaef34da248","Type":"ContainerStarted","Data":"756037f3c75add192500cc5c007cc0c20b95aad2e7d6a133df86b5b8ee4a6efc"}
Mar 08 21:07:46 crc kubenswrapper[4885]: I0308 21:07:46.432769 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7620c463-ffe0-4d70-ba82-deaef34da248","Type":"ContainerStarted","Data":"bba7159d14310c79ddeaa2597e896a5f6fe510b649b0b9206814fe055de3ce19"}
Mar 08 21:07:47 crc kubenswrapper[4885]: I0308 21:07:47.454135 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7620c463-ffe0-4d70-ba82-deaef34da248","Type":"ContainerStarted","Data":"2b0ded14f6da19f693e60f9e596950aa09eff6442661efedfd0b38eff3938cd9"}
Mar 08 21:07:47 crc kubenswrapper[4885]: I0308 21:07:47.486586 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.486565668 podStartE2EDuration="3.486565668s" podCreationTimestamp="2026-03-08 21:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:47.482029817 +0000 UTC m=+5768.878083850" watchObservedRunningTime="2026-03-08 21:07:47.486565668 +0000 UTC m=+5768.882619701"
Mar 08 21:07:48 crc kubenswrapper[4885]: I0308 21:07:48.450484 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 08 21:07:49 crc kubenswrapper[4885]: I0308 21:07:49.829797 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 08 21:07:55 crc kubenswrapper[4885]: I0308 21:07:55.114894 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.154062 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550068-8fldh"]
Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.156705 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550068-8fldh"
Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.160234 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.160301 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.161773 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.170900 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550068-8fldh"]
Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.301464 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tkxg\" (UniqueName: \"kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg\") pod \"auto-csr-approver-29550068-8fldh\" (UID: \"2a1f5d5d-f061-4187-a9ed-720b291774e5\") " pod="openshift-infra/auto-csr-approver-29550068-8fldh"
Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.403943 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tkxg\" (UniqueName: \"kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg\") pod \"auto-csr-approver-29550068-8fldh\" (UID: \"2a1f5d5d-f061-4187-a9ed-720b291774e5\") " pod="openshift-infra/auto-csr-approver-29550068-8fldh"
Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.435903 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tkxg\" (UniqueName: \"kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg\") pod \"auto-csr-approver-29550068-8fldh\" (UID: \"2a1f5d5d-f061-4187-a9ed-720b291774e5\") " pod="openshift-infra/auto-csr-approver-29550068-8fldh"
Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.490636 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550068-8fldh"
Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.994364 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550068-8fldh"]
Mar 08 21:08:01 crc kubenswrapper[4885]: I0308 21:08:01.651677 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550068-8fldh" event={"ID":"2a1f5d5d-f061-4187-a9ed-720b291774e5","Type":"ContainerStarted","Data":"c524e7b36661c6d5ca64f259558c9eefe8e482b0d59e246fed7dbb689fd4fcae"}
Mar 08 21:08:02 crc kubenswrapper[4885]: I0308 21:08:02.675160 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550068-8fldh" event={"ID":"2a1f5d5d-f061-4187-a9ed-720b291774e5","Type":"ContainerStarted","Data":"dcaa0b7048fe6fee9e8064fda1b6f6cbab5d7f0172b9d8b64c22f15e682b913a"}
Mar 08 21:08:02 crc kubenswrapper[4885]: I0308 21:08:02.696903 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550068-8fldh" podStartSLOduration=1.563059212 podStartE2EDuration="2.696882095s" podCreationTimestamp="2026-03-08 21:08:00 +0000 UTC" firstStartedPulling="2026-03-08 21:08:00.999610964 +0000 UTC m=+5782.395665027" lastFinishedPulling="2026-03-08 21:08:02.133433867 +0000 UTC m=+5783.529487910" observedRunningTime="2026-03-08 21:08:02.692228131 +0000 UTC m=+5784.088282154" watchObservedRunningTime="2026-03-08 21:08:02.696882095 +0000 UTC m=+5784.092936128"
Mar 08 21:08:03 crc kubenswrapper[4885]: I0308 21:08:03.688662 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a1f5d5d-f061-4187-a9ed-720b291774e5" containerID="dcaa0b7048fe6fee9e8064fda1b6f6cbab5d7f0172b9d8b64c22f15e682b913a" exitCode=0
Mar 08 21:08:03 crc kubenswrapper[4885]: I0308 21:08:03.688756 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550068-8fldh" event={"ID":"2a1f5d5d-f061-4187-a9ed-720b291774e5","Type":"ContainerDied","Data":"dcaa0b7048fe6fee9e8064fda1b6f6cbab5d7f0172b9d8b64c22f15e682b913a"}
Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.066955 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550068-8fldh"
Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.207853 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tkxg\" (UniqueName: \"kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg\") pod \"2a1f5d5d-f061-4187-a9ed-720b291774e5\" (UID: \"2a1f5d5d-f061-4187-a9ed-720b291774e5\") "
Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.221212 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg" (OuterVolumeSpecName: "kube-api-access-5tkxg") pod "2a1f5d5d-f061-4187-a9ed-720b291774e5" (UID: "2a1f5d5d-f061-4187-a9ed-720b291774e5"). InnerVolumeSpecName "kube-api-access-5tkxg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.310305 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tkxg\" (UniqueName: \"kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg\") on node \"crc\" DevicePath \"\""
Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.712797 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550068-8fldh" event={"ID":"2a1f5d5d-f061-4187-a9ed-720b291774e5","Type":"ContainerDied","Data":"c524e7b36661c6d5ca64f259558c9eefe8e482b0d59e246fed7dbb689fd4fcae"}
Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.712841 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c524e7b36661c6d5ca64f259558c9eefe8e482b0d59e246fed7dbb689fd4fcae"
Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.712870 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550068-8fldh"
Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.778389 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550062-kdw5m"]
Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.787197 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550062-kdw5m"]
Mar 08 21:08:07 crc kubenswrapper[4885]: I0308 21:08:07.385511 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86555f65-5ef4-4c45-9ac3-9b561d985b57" path="/var/lib/kubelet/pods/86555f65-5ef4-4c45-9ac3-9b561d985b57/volumes"
Mar 08 21:08:32 crc kubenswrapper[4885]: I0308 21:08:32.818687 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 21:08:32 crc kubenswrapper[4885]: I0308 21:08:32.819571 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 21:09:02 crc kubenswrapper[4885]: I0308 21:09:02.818102 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 21:09:02 crc kubenswrapper[4885]: I0308 21:09:02.820105 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 21:09:06 crc kubenswrapper[4885]: I0308 21:09:06.008032 4885 scope.go:117] "RemoveContainer" containerID="f10312aca3cd4fa64b1d669949edd0e9d6f21408d583e376501255867513b217"
Mar 08 21:09:06 crc kubenswrapper[4885]: I0308 21:09:06.082007 4885 scope.go:117] "RemoveContainer" containerID="a8688238cdfd1dcfb7637a786ad101466774be95a20bd05d6000a2fb72881a5c"
Mar 08 21:09:06 crc kubenswrapper[4885]: I0308 21:09:06.105664 4885 scope.go:117] "RemoveContainer" containerID="22e2cca09cd12eb836eb1e30ec93ff3eca0cfdcebc523a4eeae36a7ba702ee56"
Mar 08 21:09:26 crc kubenswrapper[4885]: I0308 21:09:26.050376 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c81b-account-create-update-sqjkr"]
Mar 08 21:09:26 crc kubenswrapper[4885]: I0308 21:09:26.059022 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dkqb5"]
Mar 08 21:09:26 crc kubenswrapper[4885]: I0308 21:09:26.066896 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c81b-account-create-update-sqjkr"]
Mar 08 21:09:26 crc kubenswrapper[4885]: I0308 21:09:26.073670 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dkqb5"]
Mar 08 21:09:27 crc kubenswrapper[4885]: I0308 21:09:27.405632 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af0fb78-1571-4090-a0e4-009deb2915a5" path="/var/lib/kubelet/pods/2af0fb78-1571-4090-a0e4-009deb2915a5/volumes"
Mar 08 21:09:27 crc kubenswrapper[4885]: I0308 21:09:27.406631 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dadcfb24-7e2e-42d4-b4da-4567105c11ad" path="/var/lib/kubelet/pods/dadcfb24-7e2e-42d4-b4da-4567105c11ad/volumes"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.433207 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"]
Mar 08 21:09:28 crc kubenswrapper[4885]: E0308 21:09:28.433808 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1f5d5d-f061-4187-a9ed-720b291774e5" containerName="oc"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.433829 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1f5d5d-f061-4187-a9ed-720b291774e5" containerName="oc"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.434215 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1f5d5d-f061-4187-a9ed-720b291774e5" containerName="oc"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.436498 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.451295 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"]
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.614011 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xncnq\" (UniqueName: \"kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.614123 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.614207 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.715468 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xncnq\" (UniqueName: \"kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.715541 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.715606 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.716178 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.716775 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.755593 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xncnq\" (UniqueName: \"kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.772532 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:29 crc kubenswrapper[4885]: I0308 21:09:29.276517 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"]
Mar 08 21:09:29 crc kubenswrapper[4885]: I0308 21:09:29.741477 4885 generic.go:334] "Generic (PLEG): container finished" podID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerID="2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1" exitCode=0
Mar 08 21:09:29 crc kubenswrapper[4885]: I0308 21:09:29.741559 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerDied","Data":"2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1"}
Mar 08 21:09:29 crc kubenswrapper[4885]: I0308 21:09:29.741624 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerStarted","Data":"84590b4da479b1b889ba9e8ed38b36c22bd738109e3b3292bd5c830ce0a9b1f6"}
Mar 08 21:09:29 crc kubenswrapper[4885]: I0308 21:09:29.744488 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 21:09:30 crc kubenswrapper[4885]: I0308 21:09:30.754276 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerStarted","Data":"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b"}
Mar 08 21:09:31 crc kubenswrapper[4885]: I0308 21:09:31.770692 4885 generic.go:334] "Generic (PLEG): container finished" podID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerID="37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b" exitCode=0
Mar 08 21:09:31 crc kubenswrapper[4885]: I0308 21:09:31.770790 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerDied","Data":"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b"}
Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.646476 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5jmft"]
Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.647954 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5jmft"
Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.650176 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.652089 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-kwwpv"
Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.666127 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-b6j88"]
Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.668473 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.693592 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jmft"] Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.711001 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-b6j88"] Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.787599 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerStarted","Data":"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf"} Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795752 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-log-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00348ab8-7686-4e8d-bada-3d9e32edca19-scripts\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795861 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-lib\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795945 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-log\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795970 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795991 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9fbe86b-d12b-4122-93b5-4cd373fca82b-scripts\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.796036 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.796084 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-run\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.796113 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc5pq\" (UniqueName: 
\"kubernetes.io/projected/f9fbe86b-d12b-4122-93b5-4cd373fca82b-kube-api-access-qc5pq\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.796146 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrx56\" (UniqueName: \"kubernetes.io/projected/00348ab8-7686-4e8d-bada-3d9e32edca19-kube-api-access-lrx56\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.796173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-etc-ovs\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.806573 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m7crp" podStartSLOduration=2.355334176 podStartE2EDuration="4.806554998s" podCreationTimestamp="2026-03-08 21:09:28 +0000 UTC" firstStartedPulling="2026-03-08 21:09:29.744202895 +0000 UTC m=+5871.140256928" lastFinishedPulling="2026-03-08 21:09:32.195423687 +0000 UTC m=+5873.591477750" observedRunningTime="2026-03-08 21:09:32.800663131 +0000 UTC m=+5874.196717154" watchObservedRunningTime="2026-03-08 21:09:32.806554998 +0000 UTC m=+5874.202609021" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.817998 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 
21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.818059 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.818102 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.818771 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.818827 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf" gracePeriod=600 Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.897834 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.897913 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-run\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.897944 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc5pq\" (UniqueName: \"kubernetes.io/projected/f9fbe86b-d12b-4122-93b5-4cd373fca82b-kube-api-access-qc5pq\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.897968 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrx56\" (UniqueName: \"kubernetes.io/projected/00348ab8-7686-4e8d-bada-3d9e32edca19-kube-api-access-lrx56\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.897985 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-etc-ovs\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898068 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-log-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898100 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00348ab8-7686-4e8d-bada-3d9e32edca19-scripts\") pod \"ovn-controller-5jmft\" (UID: 
\"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-lib\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898167 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898182 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-log\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9fbe86b-d12b-4122-93b5-4cd373fca82b-scripts\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898699 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898723 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-lib\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898774 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-etc-ovs\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898775 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-log-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898813 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898813 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-log\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.899022 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-run\") pod \"ovn-controller-ovs-b6j88\" (UID: 
\"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.900132 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9fbe86b-d12b-4122-93b5-4cd373fca82b-scripts\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.901804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00348ab8-7686-4e8d-bada-3d9e32edca19-scripts\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.920376 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc5pq\" (UniqueName: \"kubernetes.io/projected/f9fbe86b-d12b-4122-93b5-4cd373fca82b-kube-api-access-qc5pq\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.920500 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrx56\" (UniqueName: \"kubernetes.io/projected/00348ab8-7686-4e8d-bada-3d9e32edca19-kube-api-access-lrx56\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.963032 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.992260 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.058026 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-nbq5w"] Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.069507 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-nbq5w"] Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.379491 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21d9a63-6439-41e2-915d-9ffa3d014a30" path="/var/lib/kubelet/pods/a21d9a63-6439-41e2-915d-9ffa3d014a30/volumes" Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.447289 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jmft"] Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.803028 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jmft" event={"ID":"00348ab8-7686-4e8d-bada-3d9e32edca19","Type":"ContainerStarted","Data":"0254d6259023a959e1f163389eb4a2580a3f8bc33c2bc4bb81ff66bfc41e343d"} Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.806366 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf" exitCode=0 Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.806739 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf"} Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.806765 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"} Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.806964 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.910591 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-b6j88"] Mar 08 21:09:33 crc kubenswrapper[4885]: W0308 21:09:33.921549 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9fbe86b_d12b_4122_93b5_4cd373fca82b.slice/crio-9900ad53bc89302945bf0323cbd47d6d4a3850a40f20eab215864e96284d898e WatchSource:0}: Error finding container 9900ad53bc89302945bf0323cbd47d6d4a3850a40f20eab215864e96284d898e: Status 404 returned error can't find the container with id 9900ad53bc89302945bf0323cbd47d6d4a3850a40f20eab215864e96284d898e Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.210496 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-rzmvz"] Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.211957 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.214769 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.227885 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rzmvz"] Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.330692 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db23198-8297-4e77-aed3-78ca89d5e6f8-config\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.330760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovs-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.330803 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4bhl\" (UniqueName: \"kubernetes.io/projected/2db23198-8297-4e77-aed3-78ca89d5e6f8-kube-api-access-s4bhl\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.330880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovn-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " 
pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.432718 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db23198-8297-4e77-aed3-78ca89d5e6f8-config\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.432781 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovs-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.432827 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4bhl\" (UniqueName: \"kubernetes.io/projected/2db23198-8297-4e77-aed3-78ca89d5e6f8-kube-api-access-s4bhl\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.432946 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovn-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.433216 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovs-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: 
I0308 21:09:34.433218 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovn-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.433548 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db23198-8297-4e77-aed3-78ca89d5e6f8-config\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.452672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4bhl\" (UniqueName: \"kubernetes.io/projected/2db23198-8297-4e77-aed3-78ca89d5e6f8-kube-api-access-s4bhl\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.543114 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.817222 4885 generic.go:334] "Generic (PLEG): container finished" podID="f9fbe86b-d12b-4122-93b5-4cd373fca82b" containerID="5ce551c19a49288b39bac620717716e0b01f78076da4b4df87c526bdf4dbf80b" exitCode=0 Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.817367 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b6j88" event={"ID":"f9fbe86b-d12b-4122-93b5-4cd373fca82b","Type":"ContainerDied","Data":"5ce551c19a49288b39bac620717716e0b01f78076da4b4df87c526bdf4dbf80b"} Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.817612 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b6j88" event={"ID":"f9fbe86b-d12b-4122-93b5-4cd373fca82b","Type":"ContainerStarted","Data":"9900ad53bc89302945bf0323cbd47d6d4a3850a40f20eab215864e96284d898e"} Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.822543 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jmft" event={"ID":"00348ab8-7686-4e8d-bada-3d9e32edca19","Type":"ContainerStarted","Data":"68e12085e6f3582286197eb951027095349dfadab80a051924dc07c9275dd729"} Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.822613 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-5jmft" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.863729 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5jmft" podStartSLOduration=2.863711219 podStartE2EDuration="2.863711219s" podCreationTimestamp="2026-03-08 21:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:09:34.855285694 +0000 UTC m=+5876.251339717" watchObservedRunningTime="2026-03-08 21:09:34.863711219 +0000 UTC m=+5876.259765242" Mar 08 21:09:35 
crc kubenswrapper[4885]: I0308 21:09:35.012176 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rzmvz"]
Mar 08 21:09:35 crc kubenswrapper[4885]: W0308 21:09:35.039240 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2db23198_8297_4e77_aed3_78ca89d5e6f8.slice/crio-d218ea3785829b24171b5edfa68f1015d00eff87aa9cc7f3a7ebf8769c4439da WatchSource:0}: Error finding container d218ea3785829b24171b5edfa68f1015d00eff87aa9cc7f3a7ebf8769c4439da: Status 404 returned error can't find the container with id d218ea3785829b24171b5edfa68f1015d00eff87aa9cc7f3a7ebf8769c4439da
Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.840803 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rzmvz" event={"ID":"2db23198-8297-4e77-aed3-78ca89d5e6f8","Type":"ContainerStarted","Data":"6930581ae0fe282030d39ffb7d4844bfa5f1e0254849a14c42e00f74481c136d"}
Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.841783 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rzmvz" event={"ID":"2db23198-8297-4e77-aed3-78ca89d5e6f8","Type":"ContainerStarted","Data":"d218ea3785829b24171b5edfa68f1015d00eff87aa9cc7f3a7ebf8769c4439da"}
Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.844144 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b6j88" event={"ID":"f9fbe86b-d12b-4122-93b5-4cd373fca82b","Type":"ContainerStarted","Data":"32de8c9ba548f56a7ef576caa8ec0e4d92ea98d64cd06b8e68fb4b8b001e1f76"}
Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.844179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b6j88" event={"ID":"f9fbe86b-d12b-4122-93b5-4cd373fca82b","Type":"ContainerStarted","Data":"c2e033f0ed41790128c8a1c9bef8266f92b56cf1c5071cf6bd8265b11e1129d6"}
Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.844475 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-b6j88"
Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.865607 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-rzmvz" podStartSLOduration=1.865583922 podStartE2EDuration="1.865583922s" podCreationTimestamp="2026-03-08 21:09:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:09:35.858946085 +0000 UTC m=+5877.255000108" watchObservedRunningTime="2026-03-08 21:09:35.865583922 +0000 UTC m=+5877.261637965"
Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.905334 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-b6j88" podStartSLOduration=3.905308372 podStartE2EDuration="3.905308372s" podCreationTimestamp="2026-03-08 21:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:09:35.890912778 +0000 UTC m=+5877.286966801" watchObservedRunningTime="2026-03-08 21:09:35.905308372 +0000 UTC m=+5877.301362395"
Mar 08 21:09:36 crc kubenswrapper[4885]: I0308 21:09:36.860416 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-b6j88"
Mar 08 21:09:38 crc kubenswrapper[4885]: I0308 21:09:38.772738 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:38 crc kubenswrapper[4885]: I0308 21:09:38.773068 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:39 crc kubenswrapper[4885]: I0308 21:09:39.835572 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-m7crp" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="registry-server" probeResult="failure" output=<
Mar 08 21:09:39 crc kubenswrapper[4885]: 	timeout: failed to connect service ":50051" within 1s
Mar 08 21:09:39 crc kubenswrapper[4885]: 	>
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.670762 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"]
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.674965 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dmxzc"
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.687595 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc"
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.687644 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjrg2\" (UniqueName: \"kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc"
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.687794 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc"
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.690403 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"]
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.790047 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc"
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.790157 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc"
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.790179 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjrg2\" (UniqueName: \"kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc"
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.790644 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc"
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.790807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc"
Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.813463 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjrg2\" (UniqueName: \"kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc"
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.011905 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dmxzc"
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.222631 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-l5ssn"]
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.225523 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-l5ssn"
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.237061 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-l5ssn"]
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.304085 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt2tb\" (UniqueName: \"kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn"
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.304219 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn"
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.396508 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"]
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.408086 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn"
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.408184 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt2tb\" (UniqueName: \"kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn"
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.408776 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn"
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.450638 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt2tb\" (UniqueName: \"kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn"
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.577217 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-l5ssn"
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.931157 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerID="c4c2ca7970045efef5435938f6bb44bd5446c5ce53a852adfb403510ee1a79c2" exitCode=0
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.931216 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerDied","Data":"c4c2ca7970045efef5435938f6bb44bd5446c5ce53a852adfb403510ee1a79c2"}
Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.931548 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerStarted","Data":"475981f5e367a38b12958173cb4464d8a6a0beff487ca2a30c7e10f1682ef6fa"}
Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.124999 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-l5ssn"]
Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.796557 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-208d-account-create-update-22tjb"]
Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.798373 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-208d-account-create-update-22tjb"
Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.803044 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret"
Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.808854 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-208d-account-create-update-22tjb"]
Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.944284 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb"
Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.944403 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4l2x\" (UniqueName: \"kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb"
Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.948183 4885 generic.go:334] "Generic (PLEG): container finished" podID="93292c62-a3f4-439e-98fd-85ff17958f38" containerID="4f877a81b26ff73a856887aee5ba6b11b65e1a2c9a19d4946db0e527c8b1dfc9" exitCode=0
Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.948246 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-l5ssn" event={"ID":"93292c62-a3f4-439e-98fd-85ff17958f38","Type":"ContainerDied","Data":"4f877a81b26ff73a856887aee5ba6b11b65e1a2c9a19d4946db0e527c8b1dfc9"}
Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.948269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-l5ssn" event={"ID":"93292c62-a3f4-439e-98fd-85ff17958f38","Type":"ContainerStarted","Data":"176bc362a836ac09c93678d8e5b76cb9e4fa35bd9c97f372ab52656583346472"}
Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.046623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4l2x\" (UniqueName: \"kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb"
Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.046803 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb"
Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.047553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb"
Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.078433 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4l2x\" (UniqueName: \"kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb"
Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.176076 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-208d-account-create-update-22tjb"
Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.672496 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-208d-account-create-update-22tjb"]
Mar 08 21:09:44 crc kubenswrapper[4885]: W0308 21:09:44.683006 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef43c9be_bb15_4927_b520_fe1b5ea3cabb.slice/crio-5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92 WatchSource:0}: Error finding container 5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92: Status 404 returned error can't find the container with id 5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92
Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.969207 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerID="85ace4bbc263d67af4ff24cc59994a076cc980df82df1e2ae92a9834af20ce31" exitCode=0
Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.969314 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerDied","Data":"85ace4bbc263d67af4ff24cc59994a076cc980df82df1e2ae92a9834af20ce31"}
Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.976063 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-208d-account-create-update-22tjb" event={"ID":"ef43c9be-bb15-4927-b520-fe1b5ea3cabb","Type":"ContainerStarted","Data":"52641a15b7d3eed6bc15113db82cacc9c6bd5304460efbff7b7427e47ea8d579"}
Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.976162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-208d-account-create-update-22tjb" event={"ID":"ef43c9be-bb15-4927-b520-fe1b5ea3cabb","Type":"ContainerStarted","Data":"5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92"}
Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.018736 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-208d-account-create-update-22tjb" podStartSLOduration=2.018714176 podStartE2EDuration="2.018714176s" podCreationTimestamp="2026-03-08 21:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:09:45.017214116 +0000 UTC m=+5886.413268159" watchObservedRunningTime="2026-03-08 21:09:45.018714176 +0000 UTC m=+5886.414768209"
Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.394246 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-l5ssn"
Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.580198 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts\") pod \"93292c62-a3f4-439e-98fd-85ff17958f38\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") "
Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.580513 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt2tb\" (UniqueName: \"kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb\") pod \"93292c62-a3f4-439e-98fd-85ff17958f38\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") "
Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.581300 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93292c62-a3f4-439e-98fd-85ff17958f38" (UID: "93292c62-a3f4-439e-98fd-85ff17958f38"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.592112 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb" (OuterVolumeSpecName: "kube-api-access-nt2tb") pod "93292c62-a3f4-439e-98fd-85ff17958f38" (UID: "93292c62-a3f4-439e-98fd-85ff17958f38"). InnerVolumeSpecName "kube-api-access-nt2tb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.682684 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.682722 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt2tb\" (UniqueName: \"kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb\") on node \"crc\" DevicePath \"\""
Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.015371 4885 generic.go:334] "Generic (PLEG): container finished" podID="ef43c9be-bb15-4927-b520-fe1b5ea3cabb" containerID="52641a15b7d3eed6bc15113db82cacc9c6bd5304460efbff7b7427e47ea8d579" exitCode=0
Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.015510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-208d-account-create-update-22tjb" event={"ID":"ef43c9be-bb15-4927-b520-fe1b5ea3cabb","Type":"ContainerDied","Data":"52641a15b7d3eed6bc15113db82cacc9c6bd5304460efbff7b7427e47ea8d579"}
Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.021526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerStarted","Data":"da2f1a91c9bb51280241448d964496931abf663a64970bb68efa8e74c760d038"}
Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.025479 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-l5ssn"
Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.026488 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-l5ssn" event={"ID":"93292c62-a3f4-439e-98fd-85ff17958f38","Type":"ContainerDied","Data":"176bc362a836ac09c93678d8e5b76cb9e4fa35bd9c97f372ab52656583346472"}
Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.026570 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="176bc362a836ac09c93678d8e5b76cb9e4fa35bd9c97f372ab52656583346472"
Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.054944 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dmxzc" podStartSLOduration=2.596190973 podStartE2EDuration="5.054903474s" podCreationTimestamp="2026-03-08 21:09:41 +0000 UTC" firstStartedPulling="2026-03-08 21:09:42.937113293 +0000 UTC m=+5884.333167316" lastFinishedPulling="2026-03-08 21:09:45.395825754 +0000 UTC m=+5886.791879817" observedRunningTime="2026-03-08 21:09:46.052908551 +0000 UTC m=+5887.448962574" watchObservedRunningTime="2026-03-08 21:09:46.054903474 +0000 UTC m=+5887.450957507"
Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.411987 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-208d-account-create-update-22tjb"
Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.417314 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts\") pod \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") "
Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.417374 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4l2x\" (UniqueName: \"kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x\") pod \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") "
Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.418275 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef43c9be-bb15-4927-b520-fe1b5ea3cabb" (UID: "ef43c9be-bb15-4927-b520-fe1b5ea3cabb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.422034 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x" (OuterVolumeSpecName: "kube-api-access-p4l2x") pod "ef43c9be-bb15-4927-b520-fe1b5ea3cabb" (UID: "ef43c9be-bb15-4927-b520-fe1b5ea3cabb"). InnerVolumeSpecName "kube-api-access-p4l2x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.519944 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.519975 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4l2x\" (UniqueName: \"kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x\") on node \"crc\" DevicePath \"\""
Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.052225 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-208d-account-create-update-22tjb" event={"ID":"ef43c9be-bb15-4927-b520-fe1b5ea3cabb","Type":"ContainerDied","Data":"5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92"}
Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.052293 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92"
Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.052366 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-208d-account-create-update-22tjb"
Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.068654 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bqsgq"]
Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.078518 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bqsgq"]
Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.849806 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.934166 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.108698 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"]
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.398023 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bd0921d-5173-43dd-ac53-0ec3417dce77" path="/var/lib/kubelet/pods/8bd0921d-5173-43dd-ac53-0ec3417dce77/volumes"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.513126 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-kg2st"]
Mar 08 21:09:49 crc kubenswrapper[4885]: E0308 21:09:49.513766 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93292c62-a3f4-439e-98fd-85ff17958f38" containerName="mariadb-database-create"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.513796 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="93292c62-a3f4-439e-98fd-85ff17958f38" containerName="mariadb-database-create"
Mar 08 21:09:49 crc kubenswrapper[4885]: E0308 21:09:49.513839 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef43c9be-bb15-4927-b520-fe1b5ea3cabb" containerName="mariadb-account-create-update"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.513852 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef43c9be-bb15-4927-b520-fe1b5ea3cabb" containerName="mariadb-account-create-update"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.514219 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="93292c62-a3f4-439e-98fd-85ff17958f38" containerName="mariadb-database-create"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.514262 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef43c9be-bb15-4927-b520-fe1b5ea3cabb" containerName="mariadb-account-create-update"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.515034 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kg2st"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.524229 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-kg2st"]
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.568443 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.568610 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c46l5\" (UniqueName: \"kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.670367 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c46l5\" (UniqueName: \"kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.670865 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.672034 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.700727 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c46l5\" (UniqueName: \"kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st"
Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.840720 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kg2st"
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.070995 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-70a3-account-create-update-kvgcv"]
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.072217 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70a3-account-create-update-kvgcv"
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.074501 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret"
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.075805 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m7crp" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="registry-server" containerID="cri-o://632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf" gracePeriod=2
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.084436 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-70a3-account-create-update-kvgcv"]
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.179461 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9ftx\" (UniqueName: \"kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv"
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.179639 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv"
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.281792 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv"
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.281860 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9ftx\" (UniqueName: \"kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv"
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.283395 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv"
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.299766 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9ftx\" (UniqueName: \"kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv"
Mar 08 21:09:50 crc kubenswrapper[4885]: W0308 21:09:50.334162 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4d6415b_d535_426e_a500_cd8e25255bde.slice/crio-0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2 WatchSource:0}: Error finding container 0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2: Status 404 returned error can't find the container with id 0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.336544 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-kg2st"]
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.417017 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70a3-account-create-update-kvgcv"
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.503967 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7crp"
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.588501 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xncnq\" (UniqueName: \"kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq\") pod \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") "
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.588867 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities\") pod \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") "
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.588913 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content\") pod \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") "
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.589819 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities" (OuterVolumeSpecName: "utilities") pod "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" (UID: "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.598438 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq" (OuterVolumeSpecName: "kube-api-access-xncnq") pod "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" (UID: "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23"). InnerVolumeSpecName "kube-api-access-xncnq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.618123 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" (UID: "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.691239 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xncnq\" (UniqueName: \"kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.691282 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.691292 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.890114 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-70a3-account-create-update-kvgcv"] Mar 08 21:09:50 crc kubenswrapper[4885]: W0308 21:09:50.902969 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddfd3421_88b0_49f2_b94e_fe31c3b5c12f.slice/crio-b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2 WatchSource:0}: Error finding container b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2: Status 404 returned error can't find the container with id b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2 Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.093620 4885 generic.go:334] "Generic (PLEG): container finished" podID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerID="632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf" exitCode=0 Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.093706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerDied","Data":"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf"} Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.095085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerDied","Data":"84590b4da479b1b889ba9e8ed38b36c22bd738109e3b3292bd5c830ce0a9b1f6"} Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.093728 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.095114 4885 scope.go:117] "RemoveContainer" containerID="632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.099020 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4d6415b-d535-426e-a500-cd8e25255bde" containerID="10f01dfd93c84f82b0e33850f2cd43179983bc59ab2fd73179f62505bcc743de" exitCode=0 Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.099167 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kg2st" event={"ID":"c4d6415b-d535-426e-a500-cd8e25255bde","Type":"ContainerDied","Data":"10f01dfd93c84f82b0e33850f2cd43179983bc59ab2fd73179f62505bcc743de"} Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.099191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kg2st" event={"ID":"c4d6415b-d535-426e-a500-cd8e25255bde","Type":"ContainerStarted","Data":"0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2"} Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.101296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-70a3-account-create-update-kvgcv" 
event={"ID":"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f","Type":"ContainerStarted","Data":"b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2"} Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.126540 4885 scope.go:117] "RemoveContainer" containerID="37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.142034 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-70a3-account-create-update-kvgcv" podStartSLOduration=1.142010954 podStartE2EDuration="1.142010954s" podCreationTimestamp="2026-03-08 21:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:09:51.135535681 +0000 UTC m=+5892.531589724" watchObservedRunningTime="2026-03-08 21:09:51.142010954 +0000 UTC m=+5892.538064987" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.162448 4885 scope.go:117] "RemoveContainer" containerID="2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.172190 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"] Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.184093 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"] Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.210529 4885 scope.go:117] "RemoveContainer" containerID="632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf" Mar 08 21:09:51 crc kubenswrapper[4885]: E0308 21:09:51.211339 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf\": container with ID starting with 632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf not found: ID does not 
exist" containerID="632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.211390 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf"} err="failed to get container status \"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf\": rpc error: code = NotFound desc = could not find container \"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf\": container with ID starting with 632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf not found: ID does not exist" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.211424 4885 scope.go:117] "RemoveContainer" containerID="37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b" Mar 08 21:09:51 crc kubenswrapper[4885]: E0308 21:09:51.211945 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b\": container with ID starting with 37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b not found: ID does not exist" containerID="37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.211996 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b"} err="failed to get container status \"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b\": rpc error: code = NotFound desc = could not find container \"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b\": container with ID starting with 37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b not found: ID does not exist" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.212057 4885 scope.go:117] 
"RemoveContainer" containerID="2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1" Mar 08 21:09:51 crc kubenswrapper[4885]: E0308 21:09:51.212491 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1\": container with ID starting with 2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1 not found: ID does not exist" containerID="2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.212531 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1"} err="failed to get container status \"2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1\": rpc error: code = NotFound desc = could not find container \"2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1\": container with ID starting with 2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1 not found: ID does not exist" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.387209 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" path="/var/lib/kubelet/pods/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23/volumes" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.012081 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.012145 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.095824 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:52 crc 
kubenswrapper[4885]: I0308 21:09:52.121518 4885 generic.go:334] "Generic (PLEG): container finished" podID="ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" containerID="787b1783aeddf609e5e191b59369719fad9abea1b0e98367db13c4196466f2fe" exitCode=0 Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.121722 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-70a3-account-create-update-kvgcv" event={"ID":"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f","Type":"ContainerDied","Data":"787b1783aeddf609e5e191b59369719fad9abea1b0e98367db13c4196466f2fe"} Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.211600 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.538527 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.633092 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c46l5\" (UniqueName: \"kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5\") pod \"c4d6415b-d535-426e-a500-cd8e25255bde\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.633219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts\") pod \"c4d6415b-d535-426e-a500-cd8e25255bde\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.635116 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4d6415b-d535-426e-a500-cd8e25255bde" (UID: 
"c4d6415b-d535-426e-a500-cd8e25255bde"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.639889 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5" (OuterVolumeSpecName: "kube-api-access-c46l5") pod "c4d6415b-d535-426e-a500-cd8e25255bde" (UID: "c4d6415b-d535-426e-a500-cd8e25255bde"). InnerVolumeSpecName "kube-api-access-c46l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.735894 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c46l5\" (UniqueName: \"kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.735986 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.140047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kg2st" event={"ID":"c4d6415b-d535-426e-a500-cd8e25255bde","Type":"ContainerDied","Data":"0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2"} Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.140401 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.140332 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.512850 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"] Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.570266 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.651762 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9ftx\" (UniqueName: \"kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx\") pod \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.651821 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts\") pod \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.652462 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" (UID: "ddfd3421-88b0-49f2-b94e-fe31c3b5c12f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.661122 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx" (OuterVolumeSpecName: "kube-api-access-h9ftx") pod "ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" (UID: "ddfd3421-88b0-49f2-b94e-fe31c3b5c12f"). 
InnerVolumeSpecName "kube-api-access-h9ftx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.753437 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9ftx\" (UniqueName: \"kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.753485 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:54 crc kubenswrapper[4885]: I0308 21:09:54.156164 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:54 crc kubenswrapper[4885]: I0308 21:09:54.156187 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-70a3-account-create-update-kvgcv" event={"ID":"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f","Type":"ContainerDied","Data":"b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2"} Mar 08 21:09:54 crc kubenswrapper[4885]: I0308 21:09:54.156266 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2" Mar 08 21:09:54 crc kubenswrapper[4885]: I0308 21:09:54.156645 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dmxzc" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="registry-server" containerID="cri-o://da2f1a91c9bb51280241448d964496931abf663a64970bb68efa8e74c760d038" gracePeriod=2 Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.174550 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" 
containerID="da2f1a91c9bb51280241448d964496931abf663a64970bb68efa8e74c760d038" exitCode=0 Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.174770 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerDied","Data":"da2f1a91c9bb51280241448d964496931abf663a64970bb68efa8e74c760d038"} Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.175658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerDied","Data":"475981f5e367a38b12958173cb4464d8a6a0beff487ca2a30c7e10f1682ef6fa"} Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.175740 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="475981f5e367a38b12958173cb4464d8a6a0beff487ca2a30c7e10f1682ef6fa" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.215533 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.284881 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities\") pod \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.285487 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content\") pod \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.285825 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjrg2\" (UniqueName: \"kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2\") pod \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.286826 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities" (OuterVolumeSpecName: "utilities") pod "ac39ed11-7986-48ee-adf8-aa9e4b653bb7" (UID: "ac39ed11-7986-48ee-adf8-aa9e4b653bb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.303183 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2" (OuterVolumeSpecName: "kube-api-access-cjrg2") pod "ac39ed11-7986-48ee-adf8-aa9e4b653bb7" (UID: "ac39ed11-7986-48ee-adf8-aa9e4b653bb7"). InnerVolumeSpecName "kube-api-access-cjrg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.388345 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac39ed11-7986-48ee-adf8-aa9e4b653bb7" (UID: "ac39ed11-7986-48ee-adf8-aa9e4b653bb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.388759 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.388797 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjrg2\" (UniqueName: \"kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.388810 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492247 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-754cf98f97-rw6hg"] Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492697 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492719 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492735 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" containerName="mariadb-account-create-update" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492745 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" containerName="mariadb-account-create-update" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492766 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492774 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492783 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d6415b-d535-426e-a500-cd8e25255bde" containerName="mariadb-database-create" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492792 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d6415b-d535-426e-a500-cd8e25255bde" containerName="mariadb-database-create" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492805 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="extract-utilities" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492815 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="extract-utilities" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492832 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="extract-content" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492839 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="extract-content" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492855 4885 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="extract-utilities" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492864 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="extract-utilities" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492885 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="extract-content" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492893 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="extract-content" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.493465 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d6415b-d535-426e-a500-cd8e25255bde" containerName="mariadb-database-create" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.493488 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" containerName="mariadb-account-create-update" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.493511 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.493522 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.495125 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.499436 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.499650 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.500452 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-xz7gf" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.519222 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-754cf98f97-rw6hg"] Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.592169 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-scripts\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.592234 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.592261 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-octavia-run\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.592279 
4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-combined-ca-bundle\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.592315 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data-merged\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.693985 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-scripts\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.694057 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.694086 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-octavia-run\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.694102 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-combined-ca-bundle\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.694137 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data-merged\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.694599 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data-merged\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.695362 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-octavia-run\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.699219 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-scripts\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.699241 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-combined-ca-bundle\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.699595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.819238 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:56 crc kubenswrapper[4885]: I0308 21:09:56.183295 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:56 crc kubenswrapper[4885]: I0308 21:09:56.233969 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"] Mar 08 21:09:56 crc kubenswrapper[4885]: I0308 21:09:56.260759 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"] Mar 08 21:09:56 crc kubenswrapper[4885]: E0308 21:09:56.309471 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac39ed11_7986_48ee_adf8_aa9e4b653bb7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac39ed11_7986_48ee_adf8_aa9e4b653bb7.slice/crio-475981f5e367a38b12958173cb4464d8a6a0beff487ca2a30c7e10f1682ef6fa\": RecentStats: unable to find data in memory cache]" Mar 08 21:09:56 crc kubenswrapper[4885]: I0308 21:09:56.456575 4885 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/octavia-api-754cf98f97-rw6hg"] Mar 08 21:09:56 crc kubenswrapper[4885]: W0308 21:09:56.467900 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fef7207_0a04_4fb4_af9e_d9efcd13226f.slice/crio-97cf67887692bf125409a55f9c01db4e747f1c1495e7111c1df09e9027829ed5 WatchSource:0}: Error finding container 97cf67887692bf125409a55f9c01db4e747f1c1495e7111c1df09e9027829ed5: Status 404 returned error can't find the container with id 97cf67887692bf125409a55f9c01db4e747f1c1495e7111c1df09e9027829ed5 Mar 08 21:09:57 crc kubenswrapper[4885]: I0308 21:09:57.192409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-754cf98f97-rw6hg" event={"ID":"4fef7207-0a04-4fb4-af9e-d9efcd13226f","Type":"ContainerStarted","Data":"97cf67887692bf125409a55f9c01db4e747f1c1495e7111c1df09e9027829ed5"} Mar 08 21:09:57 crc kubenswrapper[4885]: I0308 21:09:57.377664 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" path="/var/lib/kubelet/pods/ac39ed11-7986-48ee-adf8-aa9e4b653bb7/volumes" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.144952 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550070-gjrwt"] Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.146313 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.148557 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.148703 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.149758 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.163818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550070-gjrwt"] Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.302940 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s94lp\" (UniqueName: \"kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp\") pod \"auto-csr-approver-29550070-gjrwt\" (UID: \"4497ae4b-d188-4afa-9546-11fbe209a9a7\") " pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.404529 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s94lp\" (UniqueName: \"kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp\") pod \"auto-csr-approver-29550070-gjrwt\" (UID: \"4497ae4b-d188-4afa-9546-11fbe209a9a7\") " pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.444685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s94lp\" (UniqueName: \"kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp\") pod \"auto-csr-approver-29550070-gjrwt\" (UID: \"4497ae4b-d188-4afa-9546-11fbe209a9a7\") " 
pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.476693 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:06 crc kubenswrapper[4885]: W0308 21:10:06.133739 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4497ae4b_d188_4afa_9546_11fbe209a9a7.slice/crio-25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898 WatchSource:0}: Error finding container 25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898: Status 404 returned error can't find the container with id 25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898 Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.135763 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550070-gjrwt"] Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.201986 4885 scope.go:117] "RemoveContainer" containerID="b2ba1b445c0bfbdc509da995c43b1467221966fc77b2d2c35df9edb0c74ad904" Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.236990 4885 scope.go:117] "RemoveContainer" containerID="7dd39daad27eae124834c42bda6676c4988f5e52cc85f87170d8845fcdc1c6e4" Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.280129 4885 scope.go:117] "RemoveContainer" containerID="3210dc1871dd8ae46bd14950976c866628de438227db3aa55b84daa5b1afb3d6" Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.300721 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-754cf98f97-rw6hg" event={"ID":"4fef7207-0a04-4fb4-af9e-d9efcd13226f","Type":"ContainerStarted","Data":"0193a69efe647b7c037ecf1f168e6419f8bff47ccc9aa4f9943fd876b33a7028"} Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.303465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" 
event={"ID":"4497ae4b-d188-4afa-9546-11fbe209a9a7","Type":"ContainerStarted","Data":"25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898"} Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.335453 4885 scope.go:117] "RemoveContainer" containerID="1ebba46867f58104b9cbcc29fb91d1649cf147537f67ba19ec589a45bcb62ce8" Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.356865 4885 scope.go:117] "RemoveContainer" containerID="2a92dad5038281e9a909493af60718e80efbed4b0a30aad4e3ed0e4f55630488" Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.379166 4885 scope.go:117] "RemoveContainer" containerID="de07ff485f289c819ea06e6137cf2c359f8f6dec75a1a6a503e4f8a88ac8bac6" Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.327783 4885 generic.go:334] "Generic (PLEG): container finished" podID="4fef7207-0a04-4fb4-af9e-d9efcd13226f" containerID="0193a69efe647b7c037ecf1f168e6419f8bff47ccc9aa4f9943fd876b33a7028" exitCode=0 Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.327884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-754cf98f97-rw6hg" event={"ID":"4fef7207-0a04-4fb4-af9e-d9efcd13226f","Type":"ContainerDied","Data":"0193a69efe647b7c037ecf1f168e6419f8bff47ccc9aa4f9943fd876b33a7028"} Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.328339 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.328352 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.328361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-754cf98f97-rw6hg" event={"ID":"4fef7207-0a04-4fb4-af9e-d9efcd13226f","Type":"ContainerStarted","Data":"6ae10a00f19c61478c4a9ec9eb56415ace500c6485074ba5682ba6415e59ce20"} Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.328375 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-754cf98f97-rw6hg" event={"ID":"4fef7207-0a04-4fb4-af9e-d9efcd13226f","Type":"ContainerStarted","Data":"6079820a59a3b2673729c7a6d46f50629c38688217d9ad16fcbf0cc730077bab"} Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.347227 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-754cf98f97-rw6hg" podStartSLOduration=2.94644817 podStartE2EDuration="12.347207898s" podCreationTimestamp="2026-03-08 21:09:55 +0000 UTC" firstStartedPulling="2026-03-08 21:09:56.469926757 +0000 UTC m=+5897.865980780" lastFinishedPulling="2026-03-08 21:10:05.870686475 +0000 UTC m=+5907.266740508" observedRunningTime="2026-03-08 21:10:07.34540255 +0000 UTC m=+5908.741456573" watchObservedRunningTime="2026-03-08 21:10:07.347207898 +0000 UTC m=+5908.743261931" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.008279 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5jmft" podUID="00348ab8-7686-4e8d-bada-3d9e32edca19" containerName="ovn-controller" probeResult="failure" output=< Mar 08 21:10:08 crc kubenswrapper[4885]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 08 21:10:08 crc kubenswrapper[4885]: > Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.034300 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.040646 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.150001 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5jmft-config-8xgln"] Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.151063 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.153390 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.161205 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jmft-config-8xgln"] Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.177690 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.177747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.177811 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.177981 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") 
" pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.178063 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67s7m\" (UniqueName: \"kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.178112 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279664 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279737 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67s7m\" (UniqueName: \"kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279781 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts\") pod 
\"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279869 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279893 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.280006 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.280166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: 
\"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.280167 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.280892 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.282489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.310111 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67s7m\" (UniqueName: \"kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.339465 4885 generic.go:334] "Generic (PLEG): container finished" podID="4497ae4b-d188-4afa-9546-11fbe209a9a7" containerID="81e3094200cf292808dcdd9d841162dea6305875cbf7d44e4dda3138e170a8d5" exitCode=0 Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.339511 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" event={"ID":"4497ae4b-d188-4afa-9546-11fbe209a9a7","Type":"ContainerDied","Data":"81e3094200cf292808dcdd9d841162dea6305875cbf7d44e4dda3138e170a8d5"} Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.524831 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:09 crc kubenswrapper[4885]: W0308 21:10:09.072177 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae101502_72b5_4462_b83f_cb263bcda010.slice/crio-6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8 WatchSource:0}: Error finding container 6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8: Status 404 returned error can't find the container with id 6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8 Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.091253 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jmft-config-8xgln"] Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.353398 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jmft-config-8xgln" event={"ID":"ae101502-72b5-4462-b83f-cb263bcda010","Type":"ContainerStarted","Data":"6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8"} Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.754008 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.812272 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s94lp\" (UniqueName: \"kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp\") pod \"4497ae4b-d188-4afa-9546-11fbe209a9a7\" (UID: \"4497ae4b-d188-4afa-9546-11fbe209a9a7\") " Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.822041 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp" (OuterVolumeSpecName: "kube-api-access-s94lp") pod "4497ae4b-d188-4afa-9546-11fbe209a9a7" (UID: "4497ae4b-d188-4afa-9546-11fbe209a9a7"). InnerVolumeSpecName "kube-api-access-s94lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.914491 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s94lp\" (UniqueName: \"kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.364459 4885 generic.go:334] "Generic (PLEG): container finished" podID="ae101502-72b5-4462-b83f-cb263bcda010" containerID="b17a0512c9878aeaac8d1e7a329d963d016e56adb51b5c377e47629e2282f0c5" exitCode=0 Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.364752 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jmft-config-8xgln" event={"ID":"ae101502-72b5-4462-b83f-cb263bcda010","Type":"ContainerDied","Data":"b17a0512c9878aeaac8d1e7a329d963d016e56adb51b5c377e47629e2282f0c5"} Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.366396 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" 
event={"ID":"4497ae4b-d188-4afa-9546-11fbe209a9a7","Type":"ContainerDied","Data":"25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898"} Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.366460 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898" Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.366497 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.848504 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550064-54cxw"] Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.860145 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550064-54cxw"] Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.403856 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b5a791-d720-4a5c-9138-abe584a56755" path="/var/lib/kubelet/pods/94b5a791-d720-4a5c-9138-abe584a56755/volumes" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.757110 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.853765 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.853872 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.853937 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.854069 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67s7m\" (UniqueName: \"kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.854168 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.854203 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.854724 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run" (OuterVolumeSpecName: "var-run") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.854755 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.856550 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts" (OuterVolumeSpecName: "scripts") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.857005 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.857356 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.866141 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m" (OuterVolumeSpecName: "kube-api-access-67s7m") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "kube-api-access-67s7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959072 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67s7m\" (UniqueName: \"kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959114 4885 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959123 4885 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959131 4885 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959141 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959150 4885 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.281427 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-b8ndv"] Mar 08 21:10:12 crc kubenswrapper[4885]: E0308 21:10:12.281982 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae101502-72b5-4462-b83f-cb263bcda010" containerName="ovn-config" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.282014 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae101502-72b5-4462-b83f-cb263bcda010" containerName="ovn-config" Mar 08 21:10:12 crc kubenswrapper[4885]: E0308 21:10:12.282034 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4497ae4b-d188-4afa-9546-11fbe209a9a7" containerName="oc" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.282043 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4497ae4b-d188-4afa-9546-11fbe209a9a7" containerName="oc" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.282257 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4497ae4b-d188-4afa-9546-11fbe209a9a7" containerName="oc" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.282276 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae101502-72b5-4462-b83f-cb263bcda010" containerName="ovn-config" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.283684 4885 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.287202 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.287423 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.287591 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.296316 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-b8ndv"] Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.400515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jmft-config-8xgln" event={"ID":"ae101502-72b5-4462-b83f-cb263bcda010","Type":"ContainerDied","Data":"6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8"} Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.400552 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.400567 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.465723 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/956e4845-c662-402d-adb6-b05143af6570-hm-ports\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.465874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-config-data\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.466196 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-scripts\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.466225 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/956e4845-c662-402d-adb6-b05143af6570-config-data-merged\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.568161 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" 
(UniqueName: \"kubernetes.io/configmap/956e4845-c662-402d-adb6-b05143af6570-hm-ports\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.568272 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-config-data\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.568440 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-scripts\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.568492 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/956e4845-c662-402d-adb6-b05143af6570-config-data-merged\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.568939 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/956e4845-c662-402d-adb6-b05143af6570-config-data-merged\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.569236 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/956e4845-c662-402d-adb6-b05143af6570-hm-ports\") pod \"octavia-rsyslog-b8ndv\" (UID: 
\"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.575062 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-scripts\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.575501 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-config-data\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.609839 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.885080 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5jmft-config-8xgln"] Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.957138 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5jmft-config-8xgln"] Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.976338 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.978131 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.981154 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.986276 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.012056 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-5jmft" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.077803 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.077873 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.180403 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.180455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" 
(UniqueName: \"kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.181079 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.185680 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.228271 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-b8ndv"] Mar 08 21:10:13 crc kubenswrapper[4885]: W0308 21:10:13.243468 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod956e4845_c662_402d_adb6_b05143af6570.slice/crio-3d7e853f05d33451ea28aec8f3c7751140e01db295fd2d484cb1a21f18698f16 WatchSource:0}: Error finding container 3d7e853f05d33451ea28aec8f3c7751140e01db295fd2d484cb1a21f18698f16: Status 404 returned error can't find the container with id 3d7e853f05d33451ea28aec8f3c7751140e01db295fd2d484cb1a21f18698f16 Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.314520 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.327521 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-b8ndv"] Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.380225 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae101502-72b5-4462-b83f-cb263bcda010" path="/var/lib/kubelet/pods/ae101502-72b5-4462-b83f-cb263bcda010/volumes" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.437720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-b8ndv" event={"ID":"956e4845-c662-402d-adb6-b05143af6570","Type":"ContainerStarted","Data":"3d7e853f05d33451ea28aec8f3c7751140e01db295fd2d484cb1a21f18698f16"} Mar 08 21:10:13 crc kubenswrapper[4885]: W0308 21:10:13.754470 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod302e26ba_0f77_4f06_a12e_74888dfc7821.slice/crio-93038e652442b1279a170e696ea3cb5fbf35b66d2f9f1b908f62cf65c5297a72 WatchSource:0}: Error finding container 93038e652442b1279a170e696ea3cb5fbf35b66d2f9f1b908f62cf65c5297a72: Status 404 returned error can't find the container with id 93038e652442b1279a170e696ea3cb5fbf35b66d2f9f1b908f62cf65c5297a72 Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.760167 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:14 crc kubenswrapper[4885]: I0308 21:10:14.480982 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" event={"ID":"302e26ba-0f77-4f06-a12e-74888dfc7821","Type":"ContainerStarted","Data":"93038e652442b1279a170e696ea3cb5fbf35b66d2f9f1b908f62cf65c5297a72"} Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.515263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-b8ndv" 
event={"ID":"956e4845-c662-402d-adb6-b05143af6570","Type":"ContainerStarted","Data":"c7fe7eab8c650abf7e77d2475a720f1ad85befa41cb8706c15a3ecb4c7232e6c"} Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.875989 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-bvj5k"] Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.878275 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.881159 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.893348 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.893459 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.893511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.893612 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.895509 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-bvj5k"] Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.995602 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.995685 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.995718 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.995739 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.996396 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:18 crc kubenswrapper[4885]: I0308 21:10:18.006683 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:18 crc kubenswrapper[4885]: I0308 21:10:18.007611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:18 crc kubenswrapper[4885]: I0308 21:10:18.009514 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:18 crc kubenswrapper[4885]: I0308 21:10:18.209360 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:18 crc kubenswrapper[4885]: I0308 21:10:18.825022 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-bvj5k"] Mar 08 21:10:18 crc kubenswrapper[4885]: W0308 21:10:18.842481 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod020ad790_8a8c_4e05_b3da_d6b823bb37e2.slice/crio-8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76 WatchSource:0}: Error finding container 8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76: Status 404 returned error can't find the container with id 8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76 Mar 08 21:10:19 crc kubenswrapper[4885]: I0308 21:10:19.537194 4885 generic.go:334] "Generic (PLEG): container finished" podID="956e4845-c662-402d-adb6-b05143af6570" containerID="c7fe7eab8c650abf7e77d2475a720f1ad85befa41cb8706c15a3ecb4c7232e6c" exitCode=0 Mar 08 21:10:19 crc kubenswrapper[4885]: I0308 21:10:19.537281 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-b8ndv" event={"ID":"956e4845-c662-402d-adb6-b05143af6570","Type":"ContainerDied","Data":"c7fe7eab8c650abf7e77d2475a720f1ad85befa41cb8706c15a3ecb4c7232e6c"} Mar 08 21:10:19 crc kubenswrapper[4885]: I0308 21:10:19.540452 4885 generic.go:334] "Generic (PLEG): container finished" podID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerID="952b1007a07234fed2a2f1ecec5204600c8240958b93368c30ef4f62fcb4517a" exitCode=0 Mar 08 21:10:19 crc kubenswrapper[4885]: I0308 21:10:19.540495 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bvj5k" event={"ID":"020ad790-8a8c-4e05-b3da-d6b823bb37e2","Type":"ContainerDied","Data":"952b1007a07234fed2a2f1ecec5204600c8240958b93368c30ef4f62fcb4517a"} Mar 08 21:10:19 crc kubenswrapper[4885]: I0308 21:10:19.540536 4885 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/octavia-db-sync-bvj5k" event={"ID":"020ad790-8a8c-4e05-b3da-d6b823bb37e2","Type":"ContainerStarted","Data":"8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76"} Mar 08 21:10:20 crc kubenswrapper[4885]: I0308 21:10:20.552641 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bvj5k" event={"ID":"020ad790-8a8c-4e05-b3da-d6b823bb37e2","Type":"ContainerStarted","Data":"3c54259ed2f9cad06d6b82cb394e43cf58f48cf3c6a30620f67aa6eb4637dc84"} Mar 08 21:10:20 crc kubenswrapper[4885]: I0308 21:10:20.579211 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-bvj5k" podStartSLOduration=3.579187778 podStartE2EDuration="3.579187778s" podCreationTimestamp="2026-03-08 21:10:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:10:20.575252193 +0000 UTC m=+5921.971306216" watchObservedRunningTime="2026-03-08 21:10:20.579187778 +0000 UTC m=+5921.975241831" Mar 08 21:10:22 crc kubenswrapper[4885]: I0308 21:10:22.574168 4885 generic.go:334] "Generic (PLEG): container finished" podID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerID="3c54259ed2f9cad06d6b82cb394e43cf58f48cf3c6a30620f67aa6eb4637dc84" exitCode=0 Mar 08 21:10:22 crc kubenswrapper[4885]: I0308 21:10:22.574454 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bvj5k" event={"ID":"020ad790-8a8c-4e05-b3da-d6b823bb37e2","Type":"ContainerDied","Data":"3c54259ed2f9cad06d6b82cb394e43cf58f48cf3c6a30620f67aa6eb4637dc84"} Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.053737 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.137914 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data\") pod \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.138087 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged\") pod \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.138246 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts\") pod \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.138295 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle\") pod \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.143622 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data" (OuterVolumeSpecName: "config-data") pod "020ad790-8a8c-4e05-b3da-d6b823bb37e2" (UID: "020ad790-8a8c-4e05-b3da-d6b823bb37e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.145514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts" (OuterVolumeSpecName: "scripts") pod "020ad790-8a8c-4e05-b3da-d6b823bb37e2" (UID: "020ad790-8a8c-4e05-b3da-d6b823bb37e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.165214 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "020ad790-8a8c-4e05-b3da-d6b823bb37e2" (UID: "020ad790-8a8c-4e05-b3da-d6b823bb37e2"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.166066 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "020ad790-8a8c-4e05-b3da-d6b823bb37e2" (UID: "020ad790-8a8c-4e05-b3da-d6b823bb37e2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.240806 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.240863 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.240882 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.240895 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.592846 4885 generic.go:334] "Generic (PLEG): container finished" podID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerID="83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40" exitCode=0 Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.592958 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" event={"ID":"302e26ba-0f77-4f06-a12e-74888dfc7821","Type":"ContainerDied","Data":"83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40"} Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.596972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bvj5k" event={"ID":"020ad790-8a8c-4e05-b3da-d6b823bb37e2","Type":"ContainerDied","Data":"8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76"} Mar 
08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.597031 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.597114 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:25 crc kubenswrapper[4885]: I0308 21:10:25.650974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" event={"ID":"302e26ba-0f77-4f06-a12e-74888dfc7821","Type":"ContainerStarted","Data":"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338"} Mar 08 21:10:25 crc kubenswrapper[4885]: I0308 21:10:25.659467 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-b8ndv" event={"ID":"956e4845-c662-402d-adb6-b05143af6570","Type":"ContainerStarted","Data":"c03d4c72c49695c50f6e72d74250532258d64ecd15f5de57f003ea28da91d6df"} Mar 08 21:10:25 crc kubenswrapper[4885]: I0308 21:10:25.660364 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:25 crc kubenswrapper[4885]: I0308 21:10:25.687381 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" podStartSLOduration=3.788861104 podStartE2EDuration="13.687360099s" podCreationTimestamp="2026-03-08 21:10:12 +0000 UTC" firstStartedPulling="2026-03-08 21:10:13.762119545 +0000 UTC m=+5915.158173578" lastFinishedPulling="2026-03-08 21:10:23.66061855 +0000 UTC m=+5925.056672573" observedRunningTime="2026-03-08 21:10:25.675502523 +0000 UTC m=+5927.071556546" watchObservedRunningTime="2026-03-08 21:10:25.687360099 +0000 UTC m=+5927.083414142" Mar 08 21:10:25 crc kubenswrapper[4885]: I0308 21:10:25.725901 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/octavia-rsyslog-b8ndv" podStartSLOduration=2.3416002320000002 podStartE2EDuration="13.725879357s" podCreationTimestamp="2026-03-08 21:10:12 +0000 UTC" firstStartedPulling="2026-03-08 21:10:13.245886565 +0000 UTC m=+5914.641940588" lastFinishedPulling="2026-03-08 21:10:24.63016568 +0000 UTC m=+5926.026219713" observedRunningTime="2026-03-08 21:10:25.714839912 +0000 UTC m=+5927.110893945" watchObservedRunningTime="2026-03-08 21:10:25.725879357 +0000 UTC m=+5927.121933380" Mar 08 21:10:29 crc kubenswrapper[4885]: I0308 21:10:29.626556 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:10:29 crc kubenswrapper[4885]: I0308 21:10:29.858968 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:10:42 crc kubenswrapper[4885]: I0308 21:10:42.640804 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.188057 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.188995 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerName="octavia-amphora-httpd" containerID="cri-o://187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338" gracePeriod=30 Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.747471 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.855664 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image\") pod \"302e26ba-0f77-4f06-a12e-74888dfc7821\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.855794 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config\") pod \"302e26ba-0f77-4f06-a12e-74888dfc7821\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.883260 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "302e26ba-0f77-4f06-a12e-74888dfc7821" (UID: "302e26ba-0f77-4f06-a12e-74888dfc7821"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.926772 4885 generic.go:334] "Generic (PLEG): container finished" podID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerID="187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338" exitCode=0 Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.927062 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" event={"ID":"302e26ba-0f77-4f06-a12e-74888dfc7821","Type":"ContainerDied","Data":"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338"} Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.927191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" event={"ID":"302e26ba-0f77-4f06-a12e-74888dfc7821","Type":"ContainerDied","Data":"93038e652442b1279a170e696ea3cb5fbf35b66d2f9f1b908f62cf65c5297a72"} Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.927281 4885 scope.go:117] "RemoveContainer" containerID="187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.927483 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.947487 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "302e26ba-0f77-4f06-a12e-74888dfc7821" (UID: "302e26ba-0f77-4f06-a12e-74888dfc7821"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.957817 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.957847 4885 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.970205 4885 scope.go:117] "RemoveContainer" containerID="83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.991304 4885 scope.go:117] "RemoveContainer" containerID="187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338" Mar 08 21:10:50 crc kubenswrapper[4885]: E0308 21:10:50.991807 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338\": container with ID starting with 187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338 not found: ID does not exist" containerID="187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.991870 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338"} err="failed to get container status \"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338\": rpc error: code = NotFound desc = could not find container \"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338\": container with ID starting with 187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338 not found: ID does not exist" 
Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.991894 4885 scope.go:117] "RemoveContainer" containerID="83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40" Mar 08 21:10:50 crc kubenswrapper[4885]: E0308 21:10:50.992424 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40\": container with ID starting with 83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40 not found: ID does not exist" containerID="83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.992477 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40"} err="failed to get container status \"83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40\": rpc error: code = NotFound desc = could not find container \"83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40\": container with ID starting with 83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40 not found: ID does not exist" Mar 08 21:10:51 crc kubenswrapper[4885]: I0308 21:10:51.278483 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:51 crc kubenswrapper[4885]: I0308 21:10:51.295188 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:51 crc kubenswrapper[4885]: I0308 21:10:51.390052 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" path="/var/lib/kubelet/pods/302e26ba-0f77-4f06-a12e-74888dfc7821/volumes" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.465744 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/octavia-image-upload-6f5964dbc9-msqj4"] Mar 08 21:10:54 crc kubenswrapper[4885]: E0308 21:10:54.466869 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerName="init" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.466890 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerName="init" Mar 08 21:10:54 crc kubenswrapper[4885]: E0308 21:10:54.466958 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerName="init" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.466971 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerName="init" Mar 08 21:10:54 crc kubenswrapper[4885]: E0308 21:10:54.467000 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerName="octavia-amphora-httpd" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.467014 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerName="octavia-amphora-httpd" Mar 08 21:10:54 crc kubenswrapper[4885]: E0308 21:10:54.467038 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerName="octavia-db-sync" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.467051 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerName="octavia-db-sync" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.467373 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerName="octavia-db-sync" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.467398 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" 
containerName="octavia-amphora-httpd" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.469092 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.472118 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.490762 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-msqj4"] Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.534874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.534948 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.636824 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.636882 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: 
\"kubernetes.io/empty-dir/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.637479 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.647042 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.803513 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:55 crc kubenswrapper[4885]: I0308 21:10:55.263494 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-msqj4"] Mar 08 21:10:55 crc kubenswrapper[4885]: I0308 21:10:55.999039 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" event={"ID":"6b1c35f0-ed5f-411a-a0ec-1270fd04e266","Type":"ContainerStarted","Data":"565a88213cdf37df5cce6da306934159b3d147d6d31342f60943171f8b47a5f0"} Mar 08 21:10:55 crc kubenswrapper[4885]: I0308 21:10:55.999429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" event={"ID":"6b1c35f0-ed5f-411a-a0ec-1270fd04e266","Type":"ContainerStarted","Data":"13b44627660c6d5828e1ffce08afc4d5b640263a51ff23dacea929e4384d65bb"} Mar 08 21:10:57 crc kubenswrapper[4885]: I0308 21:10:57.021278 4885 generic.go:334] "Generic (PLEG): container finished" podID="6b1c35f0-ed5f-411a-a0ec-1270fd04e266" containerID="565a88213cdf37df5cce6da306934159b3d147d6d31342f60943171f8b47a5f0" exitCode=0 Mar 08 21:10:57 crc kubenswrapper[4885]: I0308 21:10:57.021882 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" event={"ID":"6b1c35f0-ed5f-411a-a0ec-1270fd04e266","Type":"ContainerDied","Data":"565a88213cdf37df5cce6da306934159b3d147d6d31342f60943171f8b47a5f0"} Mar 08 21:10:58 crc kubenswrapper[4885]: I0308 21:10:58.036360 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" event={"ID":"6b1c35f0-ed5f-411a-a0ec-1270fd04e266","Type":"ContainerStarted","Data":"74683b96c32b01537600c9e68d79b2282c47487f6c8eae71ca83b94efdde9d7f"} Mar 08 21:10:58 crc kubenswrapper[4885]: I0308 21:10:58.059562 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" 
podStartSLOduration=3.611811667 podStartE2EDuration="4.05954289s" podCreationTimestamp="2026-03-08 21:10:54 +0000 UTC" firstStartedPulling="2026-03-08 21:10:55.269733446 +0000 UTC m=+5956.665787479" lastFinishedPulling="2026-03-08 21:10:55.717464649 +0000 UTC m=+5957.113518702" observedRunningTime="2026-03-08 21:10:58.056748845 +0000 UTC m=+5959.452802868" watchObservedRunningTime="2026-03-08 21:10:58.05954289 +0000 UTC m=+5959.455596913" Mar 08 21:11:06 crc kubenswrapper[4885]: I0308 21:11:06.555669 4885 scope.go:117] "RemoveContainer" containerID="d40c8b02d2c6b1b5fefbc9a10d09bda45776bb36be850d751409c013d3a63ca6" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.529197 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-8t2fl"] Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.531493 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.535085 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.535420 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.542071 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.550108 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-8t2fl"] Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.602677 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-scripts\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " 
pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.602757 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-combined-ca-bundle\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.602840 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.602896 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-hm-ports\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.602962 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-amphora-certs\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.603017 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data-merged\") pod \"octavia-healthmanager-8t2fl\" (UID: 
\"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.705802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.705986 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-hm-ports\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.706103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-amphora-certs\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.706233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data-merged\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.706318 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-scripts\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 
21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.706421 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-combined-ca-bundle\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.710818 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-hm-ports\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.712160 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data-merged\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.715093 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-combined-ca-bundle\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.717582 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-amphora-certs\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.717637 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.719990 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-scripts\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.855641 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:17 crc kubenswrapper[4885]: I0308 21:11:17.696414 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-8t2fl"] Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.194930 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-pchrs"] Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.198013 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.208934 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-pchrs"] Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.214339 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.215498 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-hm-ports\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239644 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239702 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data-merged\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239743 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-amphora-certs\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239775 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-scripts\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239803 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-combined-ca-bundle\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.255435 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-8t2fl" event={"ID":"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6","Type":"ContainerStarted","Data":"c5956a80a01d11ccb179dc4eae96a4ea0e3a4f92806db99072d0278c874aeb7b"} Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341301 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-hm-ports\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data\") pod \"octavia-housekeeping-pchrs\" (UID: 
\"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data-merged\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341416 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-amphora-certs\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341446 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-scripts\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341474 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-combined-ca-bundle\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.342699 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data-merged\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " 
pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.344157 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-hm-ports\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.346889 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-combined-ca-bundle\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.347065 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-amphora-certs\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.354501 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.354761 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-scripts\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.527224 4885 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.106703 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-pchrs"] Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.265957 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-8t2fl" event={"ID":"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6","Type":"ContainerStarted","Data":"14f9c425ffdeac5dd5e3ac325d2da1e0f647a77b481194d0dbb2a2f44b7efda0"} Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.267341 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-pchrs" event={"ID":"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8","Type":"ContainerStarted","Data":"fd030b5bbf532bf3f09a68982b303a7feff3c04c76910c861b175be15ef9b193"} Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.769971 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-8847z"] Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.772675 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.787381 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.787578 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.787691 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-8847z"] Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.873382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-amphora-certs\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.873804 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-combined-ca-bundle\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.874053 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-scripts\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.874239 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-hm-ports\") pod 
\"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.874374 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.874575 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data-merged\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976603 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-combined-ca-bundle\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976664 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-scripts\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976696 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-hm-ports\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 
21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976715 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976750 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data-merged\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976804 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-amphora-certs\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.979040 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-hm-ports\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.986895 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-combined-ca-bundle\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.987140 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" 
(UniqueName: \"kubernetes.io/empty-dir/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data-merged\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.988113 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-scripts\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.991186 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-amphora-certs\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.992175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:20 crc kubenswrapper[4885]: I0308 21:11:20.127693 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-8847z" Mar 08 21:11:20 crc kubenswrapper[4885]: I0308 21:11:20.312194 4885 generic.go:334] "Generic (PLEG): container finished" podID="9d4d983f-9ee9-4341-bf69-0c2fc610a2d6" containerID="14f9c425ffdeac5dd5e3ac325d2da1e0f647a77b481194d0dbb2a2f44b7efda0" exitCode=0 Mar 08 21:11:20 crc kubenswrapper[4885]: I0308 21:11:20.312250 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-8t2fl" event={"ID":"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6","Type":"ContainerDied","Data":"14f9c425ffdeac5dd5e3ac325d2da1e0f647a77b481194d0dbb2a2f44b7efda0"} Mar 08 21:11:20 crc kubenswrapper[4885]: I0308 21:11:20.800508 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-8847z"] Mar 08 21:11:21 crc kubenswrapper[4885]: I0308 21:11:21.444780 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:21 crc kubenswrapper[4885]: I0308 21:11:21.445321 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-8t2fl" event={"ID":"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6","Type":"ContainerStarted","Data":"1b4bf725669d42dcefe21ca4568a95968ca89f01d4de0ce4713d48355980145e"} Mar 08 21:11:21 crc kubenswrapper[4885]: I0308 21:11:21.446018 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8847z" event={"ID":"3c0bddea-5630-4e74-8bc9-ec81fc3eba56","Type":"ContainerStarted","Data":"892302d5823444d7c1ba78dc7ffe8568afc3a073b1628b7312df055cd88abc0b"} Mar 08 21:11:21 crc kubenswrapper[4885]: I0308 21:11:21.486947 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-8t2fl" podStartSLOduration=5.486916562 podStartE2EDuration="5.486916562s" podCreationTimestamp="2026-03-08 21:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-08 21:11:21.46211211 +0000 UTC m=+5982.858166153" watchObservedRunningTime="2026-03-08 21:11:21.486916562 +0000 UTC m=+5982.882970585" Mar 08 21:11:21 crc kubenswrapper[4885]: I0308 21:11:21.563396 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-8t2fl"] Mar 08 21:11:22 crc kubenswrapper[4885]: I0308 21:11:22.461105 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-pchrs" event={"ID":"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8","Type":"ContainerStarted","Data":"20d00b8688c1c08fbabb5e9207d2b56871e50bcfe9f23bb1ce969cd8c9fdc903"} Mar 08 21:11:23 crc kubenswrapper[4885]: I0308 21:11:23.480226 4885 generic.go:334] "Generic (PLEG): container finished" podID="c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8" containerID="20d00b8688c1c08fbabb5e9207d2b56871e50bcfe9f23bb1ce969cd8c9fdc903" exitCode=0 Mar 08 21:11:23 crc kubenswrapper[4885]: I0308 21:11:23.480324 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-pchrs" event={"ID":"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8","Type":"ContainerDied","Data":"20d00b8688c1c08fbabb5e9207d2b56871e50bcfe9f23bb1ce969cd8c9fdc903"} Mar 08 21:11:24 crc kubenswrapper[4885]: I0308 21:11:24.489346 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8847z" event={"ID":"3c0bddea-5630-4e74-8bc9-ec81fc3eba56","Type":"ContainerStarted","Data":"5e9e8c96253911a7ae51b04fedb535672ce6656d583c7b23da5f45fde8b758e0"} Mar 08 21:11:24 crc kubenswrapper[4885]: I0308 21:11:24.496175 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-pchrs" event={"ID":"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8","Type":"ContainerStarted","Data":"710a8f5cb5fdacb2d1629e83c91473191cd887af35339771a85b6a6fc89819fb"} Mar 08 21:11:24 crc kubenswrapper[4885]: I0308 21:11:24.496399 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:24 crc kubenswrapper[4885]: I0308 21:11:24.532158 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-pchrs" podStartSLOduration=4.068948046 podStartE2EDuration="6.532142808s" podCreationTimestamp="2026-03-08 21:11:18 +0000 UTC" firstStartedPulling="2026-03-08 21:11:19.106289862 +0000 UTC m=+5980.502343885" lastFinishedPulling="2026-03-08 21:11:21.569484624 +0000 UTC m=+5982.965538647" observedRunningTime="2026-03-08 21:11:24.528273444 +0000 UTC m=+5985.924327467" watchObservedRunningTime="2026-03-08 21:11:24.532142808 +0000 UTC m=+5985.928196831" Mar 08 21:11:25 crc kubenswrapper[4885]: I0308 21:11:25.509644 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c0bddea-5630-4e74-8bc9-ec81fc3eba56" containerID="5e9e8c96253911a7ae51b04fedb535672ce6656d583c7b23da5f45fde8b758e0" exitCode=0 Mar 08 21:11:25 crc kubenswrapper[4885]: I0308 21:11:25.509703 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8847z" event={"ID":"3c0bddea-5630-4e74-8bc9-ec81fc3eba56","Type":"ContainerDied","Data":"5e9e8c96253911a7ae51b04fedb535672ce6656d583c7b23da5f45fde8b758e0"} Mar 08 21:11:26 crc kubenswrapper[4885]: I0308 21:11:26.524700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8847z" event={"ID":"3c0bddea-5630-4e74-8bc9-ec81fc3eba56","Type":"ContainerStarted","Data":"1b6dfb0be525ff4b81354acb4f385146a928027992b7a317dbe0994268d40bb9"} Mar 08 21:11:26 crc kubenswrapper[4885]: I0308 21:11:26.525018 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-8847z" Mar 08 21:11:26 crc kubenswrapper[4885]: I0308 21:11:26.552911 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-8847z" podStartSLOduration=4.84498176 podStartE2EDuration="7.552888148s" podCreationTimestamp="2026-03-08 21:11:19 +0000 UTC" 
firstStartedPulling="2026-03-08 21:11:20.813156441 +0000 UTC m=+5982.209210464" lastFinishedPulling="2026-03-08 21:11:23.521062809 +0000 UTC m=+5984.917116852" observedRunningTime="2026-03-08 21:11:26.5451354 +0000 UTC m=+5987.941189473" watchObservedRunningTime="2026-03-08 21:11:26.552888148 +0000 UTC m=+5987.948942171" Mar 08 21:11:31 crc kubenswrapper[4885]: I0308 21:11:31.888682 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:33 crc kubenswrapper[4885]: I0308 21:11:33.564715 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:35 crc kubenswrapper[4885]: I0308 21:11:35.202379 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-8847z" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.009589 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.011665 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.036570 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.036999 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.037506 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-dwvls" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.043836 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.076432 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.088624 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.092092 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-log" containerID="cri-o://393e441f3ad9404ef23527c1976a928e080636334f6d4fef814d8251c19b8033" gracePeriod=30 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.092439 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-httpd" containerID="cri-o://45cd60c6ca50a5d19396518626fd9ead690756368037eda8deb040859bd438c4" gracePeriod=30 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.101296 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmnvn\" (UniqueName: 
\"kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.101363 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.101430 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.101461 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.101515 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.137943 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 
21:11:45.138386 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-log" containerID="cri-o://6b230c929340da8fad3a15c45277ba0e659ca5d1578d43a5536de68f31fcf158" gracePeriod=30 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.138871 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-httpd" containerID="cri-o://77cc037af22a97f866052e3343ac5fbb7bf64bc7562100db2c700da1dbaae719" gracePeriod=30 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.159676 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-cfbd9754f-492lw"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.161886 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.179613 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-cfbd9754f-492lw"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203038 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203476 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203568 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203634 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203758 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmnvn\" (UniqueName: \"kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.204431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.205262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.212453 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.222961 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmnvn\" (UniqueName: \"kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.305467 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.305518 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.305557 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm7hl\" (UniqueName: 
\"kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.305803 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.306069 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.377414 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.410326 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm7hl\" (UniqueName: \"kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.410638 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.410824 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.411007 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.411134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 
21:11:45.411318 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.412393 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.412403 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.414815 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.434485 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm7hl\" (UniqueName: \"kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.479120 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.682688 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.709866 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.711512 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.739726 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.784671 4885 generic.go:334] "Generic (PLEG): container finished" podID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerID="6b230c929340da8fad3a15c45277ba0e659ca5d1578d43a5536de68f31fcf158" exitCode=143 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.785020 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerDied","Data":"6b230c929340da8fad3a15c45277ba0e659ca5d1578d43a5536de68f31fcf158"} Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.786639 4885 generic.go:334] "Generic (PLEG): container finished" podID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerID="393e441f3ad9404ef23527c1976a928e080636334f6d4fef814d8251c19b8033" exitCode=143 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.786661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerDied","Data":"393e441f3ad9404ef23527c1976a928e080636334f6d4fef814d8251c19b8033"} Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.817873 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.817993 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.818030 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.818185 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.818595 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2bq7\" (UniqueName: \"kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.923621 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.923903 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2bq7\" (UniqueName: \"kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.924042 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.924107 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.924272 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.925243 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts\") pod 
\"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.925543 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.927105 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.930176 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"] Mar 08 21:11:45 crc kubenswrapper[4885]: W0308 21:11:45.933169 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc58da53b_fda0_486e_b7d0_d6b50bfc1b62.slice/crio-3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8 WatchSource:0}: Error finding container 3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8: Status 404 returned error can't find the container with id 3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.936498 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.944320 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2bq7\" (UniqueName: \"kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.035856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.043869 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-cfbd9754f-492lw"] Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.535052 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"] Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.799871 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerStarted","Data":"3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8"} Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.802188 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerStarted","Data":"e4a35d78251bc7cb6befb4354f6fb7487a2a564eecb0e58296c981c935357145"} Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.804171 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerStarted","Data":"e892f970f210beda78d54563132ede0defb9bfa40842e20b8ca03cfb2cdffe13"} Mar 08 21:11:48 crc kubenswrapper[4885]: I0308 21:11:48.853247 4885 generic.go:334] "Generic (PLEG): container finished" podID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerID="45cd60c6ca50a5d19396518626fd9ead690756368037eda8deb040859bd438c4" 
exitCode=0 Mar 08 21:11:48 crc kubenswrapper[4885]: I0308 21:11:48.853338 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerDied","Data":"45cd60c6ca50a5d19396518626fd9ead690756368037eda8deb040859bd438c4"} Mar 08 21:11:48 crc kubenswrapper[4885]: I0308 21:11:48.856107 4885 generic.go:334] "Generic (PLEG): container finished" podID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerID="77cc037af22a97f866052e3343ac5fbb7bf64bc7562100db2c700da1dbaae719" exitCode=0 Mar 08 21:11:48 crc kubenswrapper[4885]: I0308 21:11:48.856146 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerDied","Data":"77cc037af22a97f866052e3343ac5fbb7bf64bc7562100db2c700da1dbaae719"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.488333 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.496674 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.519096 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhs2t\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.528424 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t" (OuterVolumeSpecName: "kube-api-access-xhs2t") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "kube-api-access-xhs2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628337 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628360 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628403 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628510 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628561 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628637 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628688 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 
21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628768 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628791 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628807 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qvkv\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628827 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.629254 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhs2t\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.631912 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs" (OuterVolumeSpecName: "logs") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.632663 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.633060 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs" (OuterVolumeSpecName: "logs") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.633080 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.638817 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph" (OuterVolumeSpecName: "ceph") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.639044 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph" (OuterVolumeSpecName: "ceph") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.640592 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts" (OuterVolumeSpecName: "scripts") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.648376 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv" (OuterVolumeSpecName: "kube-api-access-6qvkv") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "kube-api-access-6qvkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.648625 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts" (OuterVolumeSpecName: "scripts") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.674294 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.706228 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731378 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731404 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731413 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731421 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 
21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731429 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731438 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qvkv\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731448 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731455 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731463 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731472 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731481 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.754964 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data" (OuterVolumeSpecName: "config-data") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.758967 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data" (OuterVolumeSpecName: "config-data") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.833262 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.833299 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.927110 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.927106 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerDied","Data":"8a73a30c989c3b2ba3367e2ccdac633bed1d9b688e4b2e65328b8a2f07a6fe3b"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.927486 4885 scope.go:117] "RemoveContainer" containerID="45cd60c6ca50a5d19396518626fd9ead690756368037eda8deb040859bd438c4" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.931615 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerStarted","Data":"71f6b4756a73f0398ecd6d1de73ef9818474e86bd61fdf4176acdd4f948e0e51"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.936283 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerStarted","Data":"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.951471 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerDied","Data":"c1d3ebe64d38a65fd438521c81fee0953dda62071a7f9b6b92b7e2810c5ba230"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.951505 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.954582 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerStarted","Data":"3931ee1c99d16a879ccfc12229692a2bae1b407bce8d0e34d3d46a2ffcee39dc"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.966372 4885 scope.go:117] "RemoveContainer" containerID="393e441f3ad9404ef23527c1976a928e080636334f6d4fef814d8251c19b8033" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.997805 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.031570 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.054230 4885 scope.go:117] "RemoveContainer" containerID="77cc037af22a97f866052e3343ac5fbb7bf64bc7562100db2c700da1dbaae719" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.059787 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.069526 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.078631 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: E0308 21:11:55.079111 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-httpd" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079130 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-httpd" Mar 08 21:11:55 crc kubenswrapper[4885]: E0308 
21:11:55.079149 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079157 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: E0308 21:11:55.079182 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079191 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: E0308 21:11:55.079202 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-httpd" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079208 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-httpd" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079409 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079427 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-httpd" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079438 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079447 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-httpd" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.080460 4885 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.083642 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.083759 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zrqqr" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.083784 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.085740 4885 scope.go:117] "RemoveContainer" containerID="6b230c929340da8fad3a15c45277ba0e659ca5d1578d43a5536de68f31fcf158" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.089393 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.091129 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.092860 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.108620 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.130975 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.268918 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269017 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269056 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpjfn\" (UniqueName: 
\"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-kube-api-access-qpjfn\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269128 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxrwl\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-kube-api-access-kxrwl\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269150 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-logs\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269216 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269242 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269276 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269309 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269353 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-ceph\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269381 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269410 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269463 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371024 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371068 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371089 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpjfn\" (UniqueName: \"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-kube-api-access-qpjfn\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371122 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxrwl\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-kube-api-access-kxrwl\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371142 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-logs\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371185 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371203 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371225 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371246 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371273 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-ceph\") pod 
\"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371294 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371311 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371342 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371385 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371560 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " 
pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371715 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.373563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-logs\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.374278 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.376757 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.376991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.377656 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.378305 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-ceph\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.379303 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.381632 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.382437 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.384272 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" 
path="/var/lib/kubelet/pods/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e/volumes" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.385308 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" path="/var/lib/kubelet/pods/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8/volumes" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.390297 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.391153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpjfn\" (UniqueName: \"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-kube-api-access-qpjfn\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.392599 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxrwl\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-kube-api-access-kxrwl\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.408430 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.416408 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.972343 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerStarted","Data":"b0909c7e110caec9a5e1944f3751a491442563718d6ad89214ea7bdc75d423aa"} Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.973081 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86f4656c87-zrnj4" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon-log" containerID="cri-o://3931ee1c99d16a879ccfc12229692a2bae1b407bce8d0e34d3d46a2ffcee39dc" gracePeriod=30 Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.973611 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86f4656c87-zrnj4" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon" containerID="cri-o://b0909c7e110caec9a5e1944f3751a491442563718d6ad89214ea7bdc75d423aa" gracePeriod=30 Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.983170 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerStarted","Data":"e6bcfb134bfcc69718a960dce95ed650e8b3cfcf3c24bcd14d2d103ff348f959"} Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.989451 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerStarted","Data":"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0"} Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.009369 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-86f4656c87-zrnj4" podStartSLOduration=3.50976765 podStartE2EDuration="12.009348599s" podCreationTimestamp="2026-03-08 21:11:44 +0000 
UTC" firstStartedPulling="2026-03-08 21:11:45.935161736 +0000 UTC m=+6007.331215759" lastFinishedPulling="2026-03-08 21:11:54.434742645 +0000 UTC m=+6015.830796708" observedRunningTime="2026-03-08 21:11:55.991702146 +0000 UTC m=+6017.387756159" watchObservedRunningTime="2026-03-08 21:11:56.009348599 +0000 UTC m=+6017.405402612" Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.024162 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-cfbd9754f-492lw" podStartSLOduration=2.618126808 podStartE2EDuration="11.024141686s" podCreationTimestamp="2026-03-08 21:11:45 +0000 UTC" firstStartedPulling="2026-03-08 21:11:46.063066997 +0000 UTC m=+6007.459121020" lastFinishedPulling="2026-03-08 21:11:54.469081835 +0000 UTC m=+6015.865135898" observedRunningTime="2026-03-08 21:11:56.016657065 +0000 UTC m=+6017.412711078" watchObservedRunningTime="2026-03-08 21:11:56.024141686 +0000 UTC m=+6017.420195709" Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.039189 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.039256 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.039315 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-dbdd8c5b9-56mvx" podStartSLOduration=3.141853761 podStartE2EDuration="11.039297732s" podCreationTimestamp="2026-03-08 21:11:45 +0000 UTC" firstStartedPulling="2026-03-08 21:11:46.559236032 +0000 UTC m=+6007.955290055" lastFinishedPulling="2026-03-08 21:11:54.456679963 +0000 UTC m=+6015.852734026" observedRunningTime="2026-03-08 21:11:56.039107577 +0000 UTC m=+6017.435161610" watchObservedRunningTime="2026-03-08 21:11:56.039297732 +0000 UTC m=+6017.435351755" Mar 08 21:11:56 crc kubenswrapper[4885]: W0308 21:11:56.056056 4885 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd58de31_5f82_4acb_8713_397027fbae4f.slice/crio-95ed03836c302809eb4f2e57338885423ead13da66237ba50c0720d1bf8411c1 WatchSource:0}: Error finding container 95ed03836c302809eb4f2e57338885423ead13da66237ba50c0720d1bf8411c1: Status 404 returned error can't find the container with id 95ed03836c302809eb4f2e57338885423ead13da66237ba50c0720d1bf8411c1 Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.076407 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.868153 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:56 crc kubenswrapper[4885]: W0308 21:11:56.873382 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1efb870_06f3_40b8_baca_e418a034eaed.slice/crio-1e7bc1b9d7482149b440e313e74f1ead03cb3a2f15b4b8229215c086b038effb WatchSource:0}: Error finding container 1e7bc1b9d7482149b440e313e74f1ead03cb3a2f15b4b8229215c086b038effb: Status 404 returned error can't find the container with id 1e7bc1b9d7482149b440e313e74f1ead03cb3a2f15b4b8229215c086b038effb Mar 08 21:11:57 crc kubenswrapper[4885]: I0308 21:11:57.007296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1efb870-06f3-40b8-baca-e418a034eaed","Type":"ContainerStarted","Data":"1e7bc1b9d7482149b440e313e74f1ead03cb3a2f15b4b8229215c086b038effb"} Mar 08 21:11:57 crc kubenswrapper[4885]: I0308 21:11:57.009674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cd58de31-5f82-4acb-8713-397027fbae4f","Type":"ContainerStarted","Data":"ff00fc7a32001b8b633a5ecb4d4224b2a0c15491ae5712a09f7af91239ca81bf"} Mar 08 21:11:57 crc 
kubenswrapper[4885]: I0308 21:11:57.009721 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cd58de31-5f82-4acb-8713-397027fbae4f","Type":"ContainerStarted","Data":"95ed03836c302809eb4f2e57338885423ead13da66237ba50c0720d1bf8411c1"} Mar 08 21:11:58 crc kubenswrapper[4885]: I0308 21:11:58.025492 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cd58de31-5f82-4acb-8713-397027fbae4f","Type":"ContainerStarted","Data":"7e19df0da4127f0cc92522bd74716ccab4a8247e1c70a6b805f7b3d0e7dd4f4d"} Mar 08 21:11:58 crc kubenswrapper[4885]: I0308 21:11:58.031871 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1efb870-06f3-40b8-baca-e418a034eaed","Type":"ContainerStarted","Data":"30d1be3af3da185dd6d8f4b82fc8ebb16839644c9ecf7f387b6e417124e194a0"} Mar 08 21:11:58 crc kubenswrapper[4885]: I0308 21:11:58.031901 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1efb870-06f3-40b8-baca-e418a034eaed","Type":"ContainerStarted","Data":"bca57804f3d85629f3ba71e2ac77cc012e945329d5dcd93c84975a1075cd1068"} Mar 08 21:11:58 crc kubenswrapper[4885]: I0308 21:11:58.080037 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.08001641 podStartE2EDuration="4.08001641s" podCreationTimestamp="2026-03-08 21:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:11:58.065360357 +0000 UTC m=+6019.461414390" watchObservedRunningTime="2026-03-08 21:11:58.08001641 +0000 UTC m=+6019.476070433" Mar 08 21:11:58 crc kubenswrapper[4885]: I0308 21:11:58.080895 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" 
podStartSLOduration=4.080889513 podStartE2EDuration="4.080889513s" podCreationTimestamp="2026-03-08 21:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:11:58.046795419 +0000 UTC m=+6019.442849442" watchObservedRunningTime="2026-03-08 21:11:58.080889513 +0000 UTC m=+6019.476943536" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.148503 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550072-7xh8c"] Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.151342 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.153231 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.154318 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.156130 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.160334 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550072-7xh8c"] Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.274712 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hq2t\" (UniqueName: \"kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t\") pod \"auto-csr-approver-29550072-7xh8c\" (UID: \"4225aab0-53fa-4aa3-ac19-6827ea262916\") " pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.377452 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6hq2t\" (UniqueName: \"kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t\") pod \"auto-csr-approver-29550072-7xh8c\" (UID: \"4225aab0-53fa-4aa3-ac19-6827ea262916\") " pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.411449 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hq2t\" (UniqueName: \"kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t\") pod \"auto-csr-approver-29550072-7xh8c\" (UID: \"4225aab0-53fa-4aa3-ac19-6827ea262916\") " pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.469175 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:01 crc kubenswrapper[4885]: W0308 21:12:01.014839 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4225aab0_53fa_4aa3_ac19_6827ea262916.slice/crio-a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847 WatchSource:0}: Error finding container a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847: Status 404 returned error can't find the container with id a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847 Mar 08 21:12:01 crc kubenswrapper[4885]: I0308 21:12:01.014880 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550072-7xh8c"] Mar 08 21:12:01 crc kubenswrapper[4885]: I0308 21:12:01.070015 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" event={"ID":"4225aab0-53fa-4aa3-ac19-6827ea262916","Type":"ContainerStarted","Data":"a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847"} Mar 08 21:12:02 crc 
kubenswrapper[4885]: I0308 21:12:02.818990 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:12:02 crc kubenswrapper[4885]: I0308 21:12:02.819468 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:12:03 crc kubenswrapper[4885]: I0308 21:12:03.106850 4885 generic.go:334] "Generic (PLEG): container finished" podID="4225aab0-53fa-4aa3-ac19-6827ea262916" containerID="37c658fc25b8a42ab3b33c1713dd08f3921d30fceba25de9d5cd0b6ec8c45fc8" exitCode=0 Mar 08 21:12:03 crc kubenswrapper[4885]: I0308 21:12:03.106914 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" event={"ID":"4225aab0-53fa-4aa3-ac19-6827ea262916","Type":"ContainerDied","Data":"37c658fc25b8a42ab3b33c1713dd08f3921d30fceba25de9d5cd0b6ec8c45fc8"} Mar 08 21:12:04 crc kubenswrapper[4885]: I0308 21:12:04.546722 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:04 crc kubenswrapper[4885]: I0308 21:12:04.681451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hq2t\" (UniqueName: \"kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t\") pod \"4225aab0-53fa-4aa3-ac19-6827ea262916\" (UID: \"4225aab0-53fa-4aa3-ac19-6827ea262916\") " Mar 08 21:12:04 crc kubenswrapper[4885]: I0308 21:12:04.688060 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t" (OuterVolumeSpecName: "kube-api-access-6hq2t") pod "4225aab0-53fa-4aa3-ac19-6827ea262916" (UID: "4225aab0-53fa-4aa3-ac19-6827ea262916"). InnerVolumeSpecName "kube-api-access-6hq2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:12:04 crc kubenswrapper[4885]: I0308 21:12:04.785117 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hq2t\" (UniqueName: \"kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.132749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" event={"ID":"4225aab0-53fa-4aa3-ac19-6827ea262916","Type":"ContainerDied","Data":"a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847"} Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.132791 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.132883 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.386903 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.408955 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.408988 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.418220 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.418252 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.453010 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.472147 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.473636 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.481842 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.481891 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.487117 4885 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.504987 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.691976 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550066-6hpw5"] Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.729103 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550066-6hpw5"] Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.038658 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.158:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8080: connect: connection refused" Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.146874 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.146959 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.146977 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.146988 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.680945 4885 
scope.go:117] "RemoveContainer" containerID="b24462edde9f60cfa7555c270a546d933c909d490fc62394b9ed6e4a826084f2"
Mar 08 21:12:07 crc kubenswrapper[4885]: I0308 21:12:07.384551 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddbd1248-e534-4251-b5a6-0505b7710e6e" path="/var/lib/kubelet/pods/ddbd1248-e534-4251-b5a6-0505b7710e6e/volumes"
Mar 08 21:12:08 crc kubenswrapper[4885]: I0308 21:12:08.209102 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 08 21:12:08 crc kubenswrapper[4885]: I0308 21:12:08.209893 4885 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 08 21:12:08 crc kubenswrapper[4885]: I0308 21:12:08.234563 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 08 21:12:08 crc kubenswrapper[4885]: I0308 21:12:08.234638 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 08 21:12:08 crc kubenswrapper[4885]: I0308 21:12:08.238536 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 08 21:12:12 crc kubenswrapper[4885]: I0308 21:12:12.043174 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2750-account-create-update-m8vl9"]
Mar 08 21:12:12 crc kubenswrapper[4885]: I0308 21:12:12.056936 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bd9sr"]
Mar 08 21:12:12 crc kubenswrapper[4885]: I0308 21:12:12.065530 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2750-account-create-update-m8vl9"]
Mar 08 21:12:12 crc kubenswrapper[4885]: I0308 21:12:12.073669 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bd9sr"]
Mar 08 21:12:13 crc kubenswrapper[4885]: I0308 21:12:13.387720 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379d344d-9828-4fae-a4f4-5712113f506d" path="/var/lib/kubelet/pods/379d344d-9828-4fae-a4f4-5712113f506d/volumes"
Mar 08 21:12:13 crc kubenswrapper[4885]: I0308 21:12:13.389353 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df6b9c40-d823-4cb8-aadd-4f2aee7bd899" path="/var/lib/kubelet/pods/df6b9c40-d823-4cb8-aadd-4f2aee7bd899/volumes"
Mar 08 21:12:17 crc kubenswrapper[4885]: I0308 21:12:17.276509 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-cfbd9754f-492lw"
Mar 08 21:12:17 crc kubenswrapper[4885]: I0308 21:12:17.756657 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-dbdd8c5b9-56mvx"
Mar 08 21:12:18 crc kubenswrapper[4885]: I0308 21:12:18.940574 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-cfbd9754f-492lw"
Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.051907 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-lkccb"]
Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.063069 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-lkccb"]
Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.317099 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-dbdd8c5b9-56mvx"
Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.397454 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" path="/var/lib/kubelet/pods/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46/volumes"
Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.409775 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cfbd9754f-492lw"]
Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.410069 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon-log" containerID="cri-o://71f6b4756a73f0398ecd6d1de73ef9818474e86bd61fdf4176acdd4f948e0e51" gracePeriod=30
Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.410209 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" containerID="cri-o://e6bcfb134bfcc69718a960dce95ed650e8b3cfcf3c24bcd14d2d103ff348f959" gracePeriod=30
Mar 08 21:12:21 crc kubenswrapper[4885]: I0308 21:12:21.171819 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.86:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 21:12:21 crc kubenswrapper[4885]: I0308 21:12:21.171909 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.86:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 21:12:23 crc kubenswrapper[4885]: I0308 21:12:23.237653 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.87:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 21:12:23 crc kubenswrapper[4885]: I0308 21:12:23.237722 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.87:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 08 21:12:23 crc kubenswrapper[4885]: I0308 21:12:23.378882 4885 generic.go:334] "Generic (PLEG): container finished" podID="89778e39-b609-494b-b2b2-aebf98447dd0" containerID="e6bcfb134bfcc69718a960dce95ed650e8b3cfcf3c24bcd14d2d103ff348f959" exitCode=0
Mar 08 21:12:23 crc kubenswrapper[4885]: I0308 21:12:23.410587 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerDied","Data":"e6bcfb134bfcc69718a960dce95ed650e8b3cfcf3c24bcd14d2d103ff348f959"}
Mar 08 21:12:25 crc kubenswrapper[4885]: I0308 21:12:25.480600 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused"
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442411 4885 generic.go:334] "Generic (PLEG): container finished" podID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerID="b0909c7e110caec9a5e1944f3751a491442563718d6ad89214ea7bdc75d423aa" exitCode=137
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442775 4885 generic.go:334] "Generic (PLEG): container finished" podID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerID="3931ee1c99d16a879ccfc12229692a2bae1b407bce8d0e34d3d46a2ffcee39dc" exitCode=137
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442499 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerDied","Data":"b0909c7e110caec9a5e1944f3751a491442563718d6ad89214ea7bdc75d423aa"}
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerDied","Data":"3931ee1c99d16a879ccfc12229692a2bae1b407bce8d0e34d3d46a2ffcee39dc"}
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442838 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerDied","Data":"3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8"}
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442848 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8"
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.506859 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86f4656c87-zrnj4"
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.632005 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data\") pod \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") "
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.632212 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key\") pod \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") "
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.632291 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs\") pod \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") "
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.632329 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts\") pod \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") "
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.632347 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmnvn\" (UniqueName: \"kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn\") pod \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") "
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.635514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs" (OuterVolumeSpecName: "logs") pod "c58da53b-fda0-486e-b7d0-d6b50bfc1b62" (UID: "c58da53b-fda0-486e-b7d0-d6b50bfc1b62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.640218 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c58da53b-fda0-486e-b7d0-d6b50bfc1b62" (UID: "c58da53b-fda0-486e-b7d0-d6b50bfc1b62"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.652169 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn" (OuterVolumeSpecName: "kube-api-access-xmnvn") pod "c58da53b-fda0-486e-b7d0-d6b50bfc1b62" (UID: "c58da53b-fda0-486e-b7d0-d6b50bfc1b62"). InnerVolumeSpecName "kube-api-access-xmnvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.662149 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data" (OuterVolumeSpecName: "config-data") pod "c58da53b-fda0-486e-b7d0-d6b50bfc1b62" (UID: "c58da53b-fda0-486e-b7d0-d6b50bfc1b62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.664287 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts" (OuterVolumeSpecName: "scripts") pod "c58da53b-fda0-486e-b7d0-d6b50bfc1b62" (UID: "c58da53b-fda0-486e-b7d0-d6b50bfc1b62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.734025 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.734060 4885 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.734073 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs\") on node \"crc\" DevicePath \"\""
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.734081 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.734090 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmnvn\" (UniqueName: \"kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn\") on node \"crc\" DevicePath \"\""
Mar 08 21:12:27 crc kubenswrapper[4885]: I0308 21:12:27.455588 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86f4656c87-zrnj4"
Mar 08 21:12:27 crc kubenswrapper[4885]: I0308 21:12:27.496878 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"]
Mar 08 21:12:27 crc kubenswrapper[4885]: I0308 21:12:27.509834 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"]
Mar 08 21:12:29 crc kubenswrapper[4885]: I0308 21:12:29.387361 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" path="/var/lib/kubelet/pods/c58da53b-fda0-486e-b7d0-d6b50bfc1b62/volumes"
Mar 08 21:12:32 crc kubenswrapper[4885]: I0308 21:12:32.818392 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 21:12:32 crc kubenswrapper[4885]: I0308 21:12:32.819098 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 21:12:35 crc kubenswrapper[4885]: I0308 21:12:35.480198 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused"
Mar 08 21:12:45 crc kubenswrapper[4885]: I0308 21:12:45.480519 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused"
Mar 08 21:12:45 crc kubenswrapper[4885]: I0308 21:12:45.481513 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-cfbd9754f-492lw"
Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.061889 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-6ln6s"]
Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.078227 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b046-account-create-update-7xtx9"]
Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.087305 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6ln6s"]
Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.096299 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b046-account-create-update-7xtx9"]
Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.381374 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e0914c1-cac8-4c2d-bbe4-615218170f10" path="/var/lib/kubelet/pods/0e0914c1-cac8-4c2d-bbe4-615218170f10/volumes"
Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.382433 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ea4544-00f0-4646-a598-1efa92af4e49" path="/var/lib/kubelet/pods/d6ea4544-00f0-4646-a598-1efa92af4e49/volumes"
Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.719139 4885 generic.go:334] "Generic (PLEG): container finished" podID="89778e39-b609-494b-b2b2-aebf98447dd0" containerID="71f6b4756a73f0398ecd6d1de73ef9818474e86bd61fdf4176acdd4f948e0e51" exitCode=137
Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.719181 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerDied","Data":"71f6b4756a73f0398ecd6d1de73ef9818474e86bd61fdf4176acdd4f948e0e51"}
Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.880456 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cfbd9754f-492lw"
Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.990544 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm7hl\" (UniqueName: \"kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl\") pod \"89778e39-b609-494b-b2b2-aebf98447dd0\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") "
Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.990652 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data\") pod \"89778e39-b609-494b-b2b2-aebf98447dd0\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") "
Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.990708 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts\") pod \"89778e39-b609-494b-b2b2-aebf98447dd0\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") "
Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.990733 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key\") pod \"89778e39-b609-494b-b2b2-aebf98447dd0\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") "
Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.990798 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs\") pod \"89778e39-b609-494b-b2b2-aebf98447dd0\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") "
Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.991471 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs" (OuterVolumeSpecName: "logs") pod "89778e39-b609-494b-b2b2-aebf98447dd0" (UID: "89778e39-b609-494b-b2b2-aebf98447dd0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.997263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl" (OuterVolumeSpecName: "kube-api-access-pm7hl") pod "89778e39-b609-494b-b2b2-aebf98447dd0" (UID: "89778e39-b609-494b-b2b2-aebf98447dd0"). InnerVolumeSpecName "kube-api-access-pm7hl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.014744 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts" (OuterVolumeSpecName: "scripts") pod "89778e39-b609-494b-b2b2-aebf98447dd0" (UID: "89778e39-b609-494b-b2b2-aebf98447dd0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.022674 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "89778e39-b609-494b-b2b2-aebf98447dd0" (UID: "89778e39-b609-494b-b2b2-aebf98447dd0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.023886 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data" (OuterVolumeSpecName: "config-data") pod "89778e39-b609-494b-b2b2-aebf98447dd0" (UID: "89778e39-b609-494b-b2b2-aebf98447dd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.092450 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm7hl\" (UniqueName: \"kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl\") on node \"crc\" DevicePath \"\""
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.092484 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.092499 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.092512 4885 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.092523 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs\") on node \"crc\" DevicePath \"\""
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.733962 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerDied","Data":"e4a35d78251bc7cb6befb4354f6fb7487a2a564eecb0e58296c981c935357145"}
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.734024 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cfbd9754f-492lw"
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.734048 4885 scope.go:117] "RemoveContainer" containerID="e6bcfb134bfcc69718a960dce95ed650e8b3cfcf3c24bcd14d2d103ff348f959"
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.783795 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cfbd9754f-492lw"]
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.792664 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-cfbd9754f-492lw"]
Mar 08 21:12:50 crc kubenswrapper[4885]: E0308 21:12:50.920545 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89778e39_b609_494b_b2b2_aebf98447dd0.slice/crio-e4a35d78251bc7cb6befb4354f6fb7487a2a564eecb0e58296c981c935357145\": RecentStats: unable to find data in memory cache]"
Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.981028 4885 scope.go:117] "RemoveContainer" containerID="71f6b4756a73f0398ecd6d1de73ef9818474e86bd61fdf4176acdd4f948e0e51"
Mar 08 21:12:51 crc kubenswrapper[4885]: I0308 21:12:51.386305 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" path="/var/lib/kubelet/pods/89778e39-b609-494b-b2b2-aebf98447dd0/volumes"
Mar 08 21:12:56 crc kubenswrapper[4885]: I0308 21:12:56.049798 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-th6xc"]
Mar 08 21:12:56 crc kubenswrapper[4885]: I0308 21:12:56.071094 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-th6xc"]
Mar 08 21:12:57 crc kubenswrapper[4885]: I0308 21:12:57.386006 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faeab210-5195-4d9a-a17e-5aed2f14dc68" path="/var/lib/kubelet/pods/faeab210-5195-4d9a-a17e-5aed2f14dc68/volumes"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.521824 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5b7cfb69fc-bhpx4"]
Mar 08 21:13:02 crc kubenswrapper[4885]: E0308 21:13:02.524615 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.524746 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon"
Mar 08 21:13:02 crc kubenswrapper[4885]: E0308 21:13:02.524881 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4225aab0-53fa-4aa3-ac19-6827ea262916" containerName="oc"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.525018 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4225aab0-53fa-4aa3-ac19-6827ea262916" containerName="oc"
Mar 08 21:13:02 crc kubenswrapper[4885]: E0308 21:13:02.525142 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon-log"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.525257 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon-log"
Mar 08 21:13:02 crc kubenswrapper[4885]: E0308 21:13:02.525373 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.525467 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon"
Mar 08 21:13:02 crc kubenswrapper[4885]: E0308 21:13:02.525581 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon-log"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.525680 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon-log"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.526400 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4225aab0-53fa-4aa3-ac19-6827ea262916" containerName="oc"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.526540 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon-log"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.526685 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon-log"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.526805 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.526909 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.528779 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.548115 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b7cfb69fc-bhpx4"]
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.596069 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jlxh\" (UniqueName: \"kubernetes.io/projected/f24559d3-3f44-434a-b790-32c52475d532-kube-api-access-7jlxh\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.596171 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-scripts\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.596219 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-config-data\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.596304 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f24559d3-3f44-434a-b790-32c52475d532-horizon-secret-key\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.596455 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24559d3-3f44-434a-b790-32c52475d532-logs\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.697934 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jlxh\" (UniqueName: \"kubernetes.io/projected/f24559d3-3f44-434a-b790-32c52475d532-kube-api-access-7jlxh\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.698008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-scripts\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.698040 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-config-data\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.698090 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f24559d3-3f44-434a-b790-32c52475d532-horizon-secret-key\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.698162 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24559d3-3f44-434a-b790-32c52475d532-logs\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.698856 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24559d3-3f44-434a-b790-32c52475d532-logs\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.699028 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-scripts\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.699534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-config-data\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.704484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f24559d3-3f44-434a-b790-32c52475d532-horizon-secret-key\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.730729 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jlxh\" (UniqueName: \"kubernetes.io/projected/f24559d3-3f44-434a-b790-32c52475d532-kube-api-access-7jlxh\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.839971 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.840027 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.840084 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.840882 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.840961 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" gracePeriod=600
Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.851449 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b7cfb69fc-bhpx4"
Mar 08 21:13:03 crc kubenswrapper[4885]: E0308 21:13:03.001911 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.336652 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b7cfb69fc-bhpx4"]
Mar 08 21:13:03 crc kubenswrapper[4885]: W0308 21:13:03.338263 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf24559d3_3f44_434a_b790_32c52475d532.slice/crio-1165389af88d1d8eab2ddbc795533660ee3ebbae487f8979b787bcd281da821d WatchSource:0}: Error finding container 1165389af88d1d8eab2ddbc795533660ee3ebbae487f8979b787bcd281da821d: Status 404 returned error can't find the container with id 1165389af88d1d8eab2ddbc795533660ee3ebbae487f8979b787bcd281da821d
Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.811231 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-rstwd"]
Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.814006 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-rstwd"
Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.819172 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-rstwd"]
Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.819760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd"
Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.819843 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-668g2\" (UniqueName: \"kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd"
Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.896664 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-6865-account-create-update-5p2d8"]
Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.898049 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.901873 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.905002 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7cfb69fc-bhpx4" event={"ID":"f24559d3-3f44-434a-b790-32c52475d532","Type":"ContainerStarted","Data":"7de957ed968b504a9b250ecaaa70845cef18ee78b3fab7c8c4348d16ba4ab56c"} Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.905030 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7cfb69fc-bhpx4" event={"ID":"f24559d3-3f44-434a-b790-32c52475d532","Type":"ContainerStarted","Data":"2825c1cab2f6626bb6e17a99f19c671d0f0d4d3ae3037cdf4c42e846c7933c14"} Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.905041 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7cfb69fc-bhpx4" event={"ID":"f24559d3-3f44-434a-b790-32c52475d532","Type":"ContainerStarted","Data":"1165389af88d1d8eab2ddbc795533660ee3ebbae487f8979b787bcd281da821d"} Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.906146 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-6865-account-create-update-5p2d8"] Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.910006 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" exitCode=0 Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.910062 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"} Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.910101 
4885 scope.go:117] "RemoveContainer" containerID="f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.911022 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:13:03 crc kubenswrapper[4885]: E0308 21:13:03.911315 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.920994 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.921075 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj2zw\" (UniqueName: \"kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw\") pod \"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.921137 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-668g2\" (UniqueName: \"kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd" Mar 08 
21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.921202 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts\") pod \"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.922818 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.936380 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5b7cfb69fc-bhpx4" podStartSLOduration=1.93636371 podStartE2EDuration="1.93636371s" podCreationTimestamp="2026-03-08 21:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:13:03.93261554 +0000 UTC m=+6085.328669563" watchObservedRunningTime="2026-03-08 21:13:03.93636371 +0000 UTC m=+6085.332417733" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.948418 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-668g2\" (UniqueName: \"kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.022639 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts\") pod 
\"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.022766 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj2zw\" (UniqueName: \"kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw\") pod \"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.023650 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts\") pod \"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.046262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj2zw\" (UniqueName: \"kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw\") pod \"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.143775 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-rstwd" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.307389 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.509454 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-rstwd"] Mar 08 21:13:04 crc kubenswrapper[4885]: W0308 21:13:04.834944 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49449c65_7a7c_437f_b4d9_23b2a219485f.slice/crio-2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2 WatchSource:0}: Error finding container 2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2: Status 404 returned error can't find the container with id 2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2 Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.844894 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-6865-account-create-update-5p2d8"] Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.922125 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6865-account-create-update-5p2d8" event={"ID":"49449c65-7a7c-437f-b4d9-23b2a219485f","Type":"ContainerStarted","Data":"2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2"} Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.924703 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-rstwd" event={"ID":"b956841b-a9a1-4c38-99e9-05c6e5f9f363","Type":"ContainerStarted","Data":"9a171ca46b7a7190a2333a95503ee285e088cd83dd56cbc13cb8f2021b946782"} Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.924749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-rstwd" event={"ID":"b956841b-a9a1-4c38-99e9-05c6e5f9f363","Type":"ContainerStarted","Data":"0b52c9b69df82d97be5873b32fca34b5609792286e11cbfa2932f9a0a474109a"} Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.943061 4885 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/heat-db-create-rstwd" podStartSLOduration=1.943046512 podStartE2EDuration="1.943046512s" podCreationTimestamp="2026-03-08 21:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:13:04.939528778 +0000 UTC m=+6086.335582801" watchObservedRunningTime="2026-03-08 21:13:04.943046512 +0000 UTC m=+6086.339100535" Mar 08 21:13:05 crc kubenswrapper[4885]: I0308 21:13:05.961008 4885 generic.go:334] "Generic (PLEG): container finished" podID="49449c65-7a7c-437f-b4d9-23b2a219485f" containerID="d70a935397b593663bdf26afd9e76f5f57ebfae75c4ef218c57dc585c1689e21" exitCode=0 Mar 08 21:13:05 crc kubenswrapper[4885]: I0308 21:13:05.961140 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6865-account-create-update-5p2d8" event={"ID":"49449c65-7a7c-437f-b4d9-23b2a219485f","Type":"ContainerDied","Data":"d70a935397b593663bdf26afd9e76f5f57ebfae75c4ef218c57dc585c1689e21"} Mar 08 21:13:05 crc kubenswrapper[4885]: I0308 21:13:05.968070 4885 generic.go:334] "Generic (PLEG): container finished" podID="b956841b-a9a1-4c38-99e9-05c6e5f9f363" containerID="9a171ca46b7a7190a2333a95503ee285e088cd83dd56cbc13cb8f2021b946782" exitCode=0 Mar 08 21:13:05 crc kubenswrapper[4885]: I0308 21:13:05.968117 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-rstwd" event={"ID":"b956841b-a9a1-4c38-99e9-05c6e5f9f363","Type":"ContainerDied","Data":"9a171ca46b7a7190a2333a95503ee285e088cd83dd56cbc13cb8f2021b946782"} Mar 08 21:13:06 crc kubenswrapper[4885]: I0308 21:13:06.876606 4885 scope.go:117] "RemoveContainer" containerID="e6faf1d3d8e8aff220f85b274a14bd4ce7db4420e55b8b8af43285742e4b286e" Mar 08 21:13:06 crc kubenswrapper[4885]: I0308 21:13:06.914536 4885 scope.go:117] "RemoveContainer" containerID="005e15265fa043b9b659e044fe35d74117bca8b49d4e6e5ad4ce0be3aeda6fee" Mar 08 21:13:06 crc kubenswrapper[4885]: I0308 
21:13:06.979692 4885 scope.go:117] "RemoveContainer" containerID="f4d740c9938b3b085cc1665a4c48f0e8e5909dace559f7eedf545ed929b6ffde" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.081693 4885 scope.go:117] "RemoveContainer" containerID="bbb04e31c22cb5e6b28c152a2e21dcf4858fe3b83e03435366a3a4cecdd397ef" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.149717 4885 scope.go:117] "RemoveContainer" containerID="5238776febd95a86282109260190e0f71b38f87e63e8af8a383a946420238586" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.181282 4885 scope.go:117] "RemoveContainer" containerID="0976abd329fd9ab97b11b664eb364407380933db3aade963696b316b7306d1fa" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.235951 4885 scope.go:117] "RemoveContainer" containerID="71bb049c2d9773b9c9e48cbd2812e843fb5d6ca86b0d975e407dfe49238257fc" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.349966 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-rstwd" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.426457 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.506690 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-668g2\" (UniqueName: \"kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2\") pod \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.507650 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts\") pod \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.508233 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b956841b-a9a1-4c38-99e9-05c6e5f9f363" (UID: "b956841b-a9a1-4c38-99e9-05c6e5f9f363"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.508734 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.511635 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2" (OuterVolumeSpecName: "kube-api-access-668g2") pod "b956841b-a9a1-4c38-99e9-05c6e5f9f363" (UID: "b956841b-a9a1-4c38-99e9-05c6e5f9f363"). InnerVolumeSpecName "kube-api-access-668g2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.610557 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj2zw\" (UniqueName: \"kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw\") pod \"49449c65-7a7c-437f-b4d9-23b2a219485f\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.610714 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts\") pod \"49449c65-7a7c-437f-b4d9-23b2a219485f\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.611399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49449c65-7a7c-437f-b4d9-23b2a219485f" (UID: "49449c65-7a7c-437f-b4d9-23b2a219485f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.614235 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw" (OuterVolumeSpecName: "kube-api-access-jj2zw") pod "49449c65-7a7c-437f-b4d9-23b2a219485f" (UID: "49449c65-7a7c-437f-b4d9-23b2a219485f"). InnerVolumeSpecName "kube-api-access-jj2zw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.620160 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-668g2\" (UniqueName: \"kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.722427 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.722478 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj2zw\" (UniqueName: \"kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.060393 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-rstwd" event={"ID":"b956841b-a9a1-4c38-99e9-05c6e5f9f363","Type":"ContainerDied","Data":"0b52c9b69df82d97be5873b32fca34b5609792286e11cbfa2932f9a0a474109a"} Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.060649 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b52c9b69df82d97be5873b32fca34b5609792286e11cbfa2932f9a0a474109a" Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.060751 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-rstwd" Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.106247 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6865-account-create-update-5p2d8" event={"ID":"49449c65-7a7c-437f-b4d9-23b2a219485f","Type":"ContainerDied","Data":"2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2"} Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.106303 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2" Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.106317 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.033165 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-fs9dx"] Mar 08 21:13:09 crc kubenswrapper[4885]: E0308 21:13:09.033706 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b956841b-a9a1-4c38-99e9-05c6e5f9f363" containerName="mariadb-database-create" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.033734 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b956841b-a9a1-4c38-99e9-05c6e5f9f363" containerName="mariadb-database-create" Mar 08 21:13:09 crc kubenswrapper[4885]: E0308 21:13:09.033772 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49449c65-7a7c-437f-b4d9-23b2a219485f" containerName="mariadb-account-create-update" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.033784 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="49449c65-7a7c-437f-b4d9-23b2a219485f" containerName="mariadb-account-create-update" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.034146 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="49449c65-7a7c-437f-b4d9-23b2a219485f" 
containerName="mariadb-account-create-update" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.034193 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b956841b-a9a1-4c38-99e9-05c6e5f9f363" containerName="mariadb-database-create" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.035237 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.038208 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.038720 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2n2s7" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.050769 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fs9dx"] Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.163432 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.163511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7n26\" (UniqueName: \"kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.163575 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data\") pod 
\"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.265542 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.265621 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7n26\" (UniqueName: \"kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.265688 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.272823 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.272990 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 
21:13:09.283514 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7n26\" (UniqueName: \"kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.367006 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.892417 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fs9dx"] Mar 08 21:13:09 crc kubenswrapper[4885]: W0308 21:13:09.903477 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod495d39cf_6a4d_4ca0_90b6_9a22323d1568.slice/crio-6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a WatchSource:0}: Error finding container 6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a: Status 404 returned error can't find the container with id 6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a Mar 08 21:13:10 crc kubenswrapper[4885]: I0308 21:13:10.137595 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fs9dx" event={"ID":"495d39cf-6a4d-4ca0-90b6-9a22323d1568","Type":"ContainerStarted","Data":"6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a"} Mar 08 21:13:12 crc kubenswrapper[4885]: I0308 21:13:12.852161 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:12 crc kubenswrapper[4885]: I0308 21:13:12.852894 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:15 crc kubenswrapper[4885]: I0308 21:13:15.370605 4885 scope.go:117] "RemoveContainer" 
containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:13:15 crc kubenswrapper[4885]: E0308 21:13:15.371513 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:13:19 crc kubenswrapper[4885]: I0308 21:13:19.271163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fs9dx" event={"ID":"495d39cf-6a4d-4ca0-90b6-9a22323d1568","Type":"ContainerStarted","Data":"27c7130d460aa9b10cdac4b0fbc1bf5fbafc6534f511b76384c6bd5cf7ea008a"} Mar 08 21:13:19 crc kubenswrapper[4885]: I0308 21:13:19.313105 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-fs9dx" podStartSLOduration=2.581579234 podStartE2EDuration="11.313078524s" podCreationTimestamp="2026-03-08 21:13:08 +0000 UTC" firstStartedPulling="2026-03-08 21:13:09.907408653 +0000 UTC m=+6091.303462686" lastFinishedPulling="2026-03-08 21:13:18.638907943 +0000 UTC m=+6100.034961976" observedRunningTime="2026-03-08 21:13:19.294734772 +0000 UTC m=+6100.690788815" watchObservedRunningTime="2026-03-08 21:13:19.313078524 +0000 UTC m=+6100.709132577" Mar 08 21:13:21 crc kubenswrapper[4885]: I0308 21:13:21.303856 4885 generic.go:334] "Generic (PLEG): container finished" podID="495d39cf-6a4d-4ca0-90b6-9a22323d1568" containerID="27c7130d460aa9b10cdac4b0fbc1bf5fbafc6534f511b76384c6bd5cf7ea008a" exitCode=0 Mar 08 21:13:21 crc kubenswrapper[4885]: I0308 21:13:21.303974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fs9dx" 
event={"ID":"495d39cf-6a4d-4ca0-90b6-9a22323d1568","Type":"ContainerDied","Data":"27c7130d460aa9b10cdac4b0fbc1bf5fbafc6534f511b76384c6bd5cf7ea008a"} Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.816064 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.931488 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle\") pod \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.931618 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data\") pod \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.931673 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7n26\" (UniqueName: \"kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26\") pod \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.939404 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26" (OuterVolumeSpecName: "kube-api-access-v7n26") pod "495d39cf-6a4d-4ca0-90b6-9a22323d1568" (UID: "495d39cf-6a4d-4ca0-90b6-9a22323d1568"). InnerVolumeSpecName "kube-api-access-v7n26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.959542 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "495d39cf-6a4d-4ca0-90b6-9a22323d1568" (UID: "495d39cf-6a4d-4ca0-90b6-9a22323d1568"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.034745 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.034783 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7n26\" (UniqueName: \"kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.050351 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data" (OuterVolumeSpecName: "config-data") pod "495d39cf-6a4d-4ca0-90b6-9a22323d1568" (UID: "495d39cf-6a4d-4ca0-90b6-9a22323d1568"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.136781 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.332985 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fs9dx" event={"ID":"495d39cf-6a4d-4ca0-90b6-9a22323d1568","Type":"ContainerDied","Data":"6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a"} Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.333044 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.333116 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.577730 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-595686fb49-hx4rx"] Mar 08 21:13:24 crc kubenswrapper[4885]: E0308 21:13:24.580384 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495d39cf-6a4d-4ca0-90b6-9a22323d1568" containerName="heat-db-sync" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.580408 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="495d39cf-6a4d-4ca0-90b6-9a22323d1568" containerName="heat-db-sync" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.580613 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="495d39cf-6a4d-4ca0-90b6-9a22323d1568" containerName="heat-db-sync" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.581307 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.586073 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.586294 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2n2s7" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.586437 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.588528 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-595686fb49-hx4rx"] Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.675072 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.675138 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data-custom\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.675182 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc98j\" (UniqueName: \"kubernetes.io/projected/9c41cdd1-29dd-4252-b988-1efaeed01573-kube-api-access-kc98j\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc 
kubenswrapper[4885]: I0308 21:13:24.675270 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-combined-ca-bundle\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.776526 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc98j\" (UniqueName: \"kubernetes.io/projected/9c41cdd1-29dd-4252-b988-1efaeed01573-kube-api-access-kc98j\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.776667 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-combined-ca-bundle\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.776743 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.776778 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data-custom\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc 
kubenswrapper[4885]: I0308 21:13:24.784186 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-796f99d566-r2p9d"] Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.795036 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data-custom\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.798665 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-combined-ca-bundle\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.806400 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.815986 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-796f99d566-r2p9d"] Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.826292 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.827029 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.839557 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc98j\" (UniqueName: \"kubernetes.io/projected/9c41cdd1-29dd-4252-b988-1efaeed01573-kube-api-access-kc98j\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.850276 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-fffd5d5b8-82pm2"] Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.851604 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.859832 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.866942 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-fffd5d5b8-82pm2"] Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.878327 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-combined-ca-bundle\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.878437 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data-custom\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.878481 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p49zs\" (UniqueName: \"kubernetes.io/projected/78788c18-3ce2-4e27-841d-e7d380fbab71-kube-api-access-p49zs\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.878519 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " 
pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.966903 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981158 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data-custom\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981221 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p49zs\" (UniqueName: \"kubernetes.io/projected/78788c18-3ce2-4e27-841d-e7d380fbab71-kube-api-access-p49zs\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981268 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981324 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-combined-ca-bundle\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981488 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvkv2\" (UniqueName: \"kubernetes.io/projected/979b34eb-586a-4d86-8e2d-7937614c714a-kube-api-access-jvkv2\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981559 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-combined-ca-bundle\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data-custom\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.996743 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data-custom\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.001552 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-combined-ca-bundle\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.002337 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.004121 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p49zs\" (UniqueName: \"kubernetes.io/projected/78788c18-3ce2-4e27-841d-e7d380fbab71-kube-api-access-p49zs\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.050975 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.083623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.083695 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-combined-ca-bundle\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.083746 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvkv2\" (UniqueName: \"kubernetes.io/projected/979b34eb-586a-4d86-8e2d-7937614c714a-kube-api-access-jvkv2\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.083802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data-custom\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.087109 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data-custom\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.088307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.095678 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-combined-ca-bundle\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.104323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jvkv2\" (UniqueName: \"kubernetes.io/projected/979b34eb-586a-4d86-8e2d-7937614c714a-kube-api-access-jvkv2\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.229759 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.236573 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: W0308 21:13:25.529581 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c41cdd1_29dd_4252_b988_1efaeed01573.slice/crio-c21e7c516389f65587a579c829742a5469afc9fa2598a4abc47d8616960692bf WatchSource:0}: Error finding container c21e7c516389f65587a579c829742a5469afc9fa2598a4abc47d8616960692bf: Status 404 returned error can't find the container with id c21e7c516389f65587a579c829742a5469afc9fa2598a4abc47d8616960692bf Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.536953 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-595686fb49-hx4rx"] Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.794632 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-fffd5d5b8-82pm2"] Mar 08 21:13:25 crc kubenswrapper[4885]: W0308 21:13:25.795286 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod979b34eb_586a_4d86_8e2d_7937614c714a.slice/crio-91c785b931fb62b2da3608f0aa73d5e6baefb65455cf6c98670fd099e47dadea WatchSource:0}: Error finding container 91c785b931fb62b2da3608f0aa73d5e6baefb65455cf6c98670fd099e47dadea: Status 404 returned error can't find the container with id 
91c785b931fb62b2da3608f0aa73d5e6baefb65455cf6c98670fd099e47dadea Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.816830 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-796f99d566-r2p9d"] Mar 08 21:13:25 crc kubenswrapper[4885]: W0308 21:13:25.817019 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78788c18_3ce2_4e27_841d_e7d380fbab71.slice/crio-f754ac183e677874f43efb9bd022b528556a8293133acc15fea8884b253153af WatchSource:0}: Error finding container f754ac183e677874f43efb9bd022b528556a8293133acc15fea8884b253153af: Status 404 returned error can't find the container with id f754ac183e677874f43efb9bd022b528556a8293133acc15fea8884b253153af Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.369023 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:13:26 crc kubenswrapper[4885]: E0308 21:13:26.369628 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.370972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-796f99d566-r2p9d" event={"ID":"78788c18-3ce2-4e27-841d-e7d380fbab71","Type":"ContainerStarted","Data":"f754ac183e677874f43efb9bd022b528556a8293133acc15fea8884b253153af"} Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.373078 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" 
event={"ID":"979b34eb-586a-4d86-8e2d-7937614c714a","Type":"ContainerStarted","Data":"91c785b931fb62b2da3608f0aa73d5e6baefb65455cf6c98670fd099e47dadea"} Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.374695 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-595686fb49-hx4rx" event={"ID":"9c41cdd1-29dd-4252-b988-1efaeed01573","Type":"ContainerStarted","Data":"7f507f8806f3620662ec6c97dfb37b2207335089aa04c38748f1bedd1a21fb14"} Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.374732 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-595686fb49-hx4rx" event={"ID":"9c41cdd1-29dd-4252-b988-1efaeed01573","Type":"ContainerStarted","Data":"c21e7c516389f65587a579c829742a5469afc9fa2598a4abc47d8616960692bf"} Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.376039 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.398884 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-595686fb49-hx4rx" podStartSLOduration=2.398865514 podStartE2EDuration="2.398865514s" podCreationTimestamp="2026-03-08 21:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:13:26.395287988 +0000 UTC m=+6107.791342021" watchObservedRunningTime="2026-03-08 21:13:26.398865514 +0000 UTC m=+6107.794919537" Mar 08 21:13:27 crc kubenswrapper[4885]: I0308 21:13:27.016233 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:27 crc kubenswrapper[4885]: I0308 21:13:27.085818 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"] Mar 08 21:13:27 crc kubenswrapper[4885]: I0308 21:13:27.086077 4885 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon-log" containerID="cri-o://321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097" gracePeriod=30 Mar 08 21:13:27 crc kubenswrapper[4885]: I0308 21:13:27.086457 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" containerID="cri-o://1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0" gracePeriod=30 Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.403053 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-796f99d566-r2p9d" event={"ID":"78788c18-3ce2-4e27-841d-e7d380fbab71","Type":"ContainerStarted","Data":"c2da5e55a35b2266f9517072b83648e3551522b93ed4406e1f6e2fcc4d9fad09"} Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.403556 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.414894 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" event={"ID":"979b34eb-586a-4d86-8e2d-7937614c714a","Type":"ContainerStarted","Data":"2a21c27e79ecf343e59c62664e37c488787c555bf3cd743c9d60f9b0f163d52b"} Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.417431 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.442863 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-796f99d566-r2p9d" podStartSLOduration=2.2876640630000002 podStartE2EDuration="5.442839272s" podCreationTimestamp="2026-03-08 21:13:24 +0000 UTC" firstStartedPulling="2026-03-08 21:13:25.824800697 +0000 UTC m=+6107.220854720" lastFinishedPulling="2026-03-08 21:13:28.979975896 +0000 UTC 
m=+6110.376029929" observedRunningTime="2026-03-08 21:13:29.430168252 +0000 UTC m=+6110.826222305" watchObservedRunningTime="2026-03-08 21:13:29.442839272 +0000 UTC m=+6110.838893295" Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.456836 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" podStartSLOduration=2.2790091710000002 podStartE2EDuration="5.456809676s" podCreationTimestamp="2026-03-08 21:13:24 +0000 UTC" firstStartedPulling="2026-03-08 21:13:25.797996839 +0000 UTC m=+6107.194050862" lastFinishedPulling="2026-03-08 21:13:28.975797344 +0000 UTC m=+6110.371851367" observedRunningTime="2026-03-08 21:13:29.447859356 +0000 UTC m=+6110.843913379" watchObservedRunningTime="2026-03-08 21:13:29.456809676 +0000 UTC m=+6110.852863699" Mar 08 21:13:30 crc kubenswrapper[4885]: I0308 21:13:30.430270 4885 generic.go:334] "Generic (PLEG): container finished" podID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerID="1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0" exitCode=0 Mar 08 21:13:30 crc kubenswrapper[4885]: I0308 21:13:30.430328 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerDied","Data":"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0"} Mar 08 21:13:36 crc kubenswrapper[4885]: I0308 21:13:36.037700 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.158:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8080: connect: connection refused" Mar 08 21:13:36 crc kubenswrapper[4885]: I0308 21:13:36.508707 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:36 crc kubenswrapper[4885]: I0308 
21:13:36.677621 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-796f99d566-r2p9d"
Mar 08 21:13:38 crc kubenswrapper[4885]: I0308 21:13:38.058665 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-plxkf"]
Mar 08 21:13:38 crc kubenswrapper[4885]: I0308 21:13:38.073635 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-75f7-account-create-update-7q9gr"]
Mar 08 21:13:38 crc kubenswrapper[4885]: I0308 21:13:38.084889 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-75f7-account-create-update-7q9gr"]
Mar 08 21:13:38 crc kubenswrapper[4885]: I0308 21:13:38.098271 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-plxkf"]
Mar 08 21:13:38 crc kubenswrapper[4885]: I0308 21:13:38.368652 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:13:38 crc kubenswrapper[4885]: E0308 21:13:38.369075 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:13:39 crc kubenswrapper[4885]: I0308 21:13:39.388049 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" path="/var/lib/kubelet/pods/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d/volumes"
Mar 08 21:13:39 crc kubenswrapper[4885]: I0308 21:13:39.392090 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7ace920-d540-4598-82db-315caa467acb" path="/var/lib/kubelet/pods/e7ace920-d540-4598-82db-315caa467acb/volumes"
Mar 08 21:13:45 crc kubenswrapper[4885]: I0308 21:13:45.007077 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-595686fb49-hx4rx"
Mar 08 21:13:46 crc kubenswrapper[4885]: I0308 21:13:46.039640 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.158:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8080: connect: connection refused"
Mar 08 21:13:46 crc kubenswrapper[4885]: I0308 21:13:46.042115 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-m792h"]
Mar 08 21:13:46 crc kubenswrapper[4885]: I0308 21:13:46.057194 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-m792h"]
Mar 08 21:13:47 crc kubenswrapper[4885]: I0308 21:13:47.391958 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07fa5fd1-f6b9-4206-809d-c1f04533cab4" path="/var/lib/kubelet/pods/07fa5fd1-f6b9-4206-809d-c1f04533cab4/volumes"
Mar 08 21:13:52 crc kubenswrapper[4885]: I0308 21:13:52.369058 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:13:52 crc kubenswrapper[4885]: E0308 21:13:52.370046 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:13:56 crc kubenswrapper[4885]: I0308 21:13:56.036881 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.158:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8080: connect: connection refused"
Mar 08 21:13:56 crc kubenswrapper[4885]: I0308 21:13:56.037585 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-dbdd8c5b9-56mvx"
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.531131 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dbdd8c5b9-56mvx"
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.637418 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data\") pod \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") "
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.637767 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2bq7\" (UniqueName: \"kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7\") pod \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") "
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.637873 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key\") pod \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") "
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.638000 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts\") pod \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") "
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.638053 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs\") pod \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") "
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.638364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs" (OuterVolumeSpecName: "logs") pod "4d4063e1-1725-43e5-bf87-422d8e4a0e5b" (UID: "4d4063e1-1725-43e5-bf87-422d8e4a0e5b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.639388 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs\") on node \"crc\" DevicePath \"\""
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.642848 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4d4063e1-1725-43e5-bf87-422d8e4a0e5b" (UID: "4d4063e1-1725-43e5-bf87-422d8e4a0e5b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.644035 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7" (OuterVolumeSpecName: "kube-api-access-x2bq7") pod "4d4063e1-1725-43e5-bf87-422d8e4a0e5b" (UID: "4d4063e1-1725-43e5-bf87-422d8e4a0e5b"). InnerVolumeSpecName "kube-api-access-x2bq7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.666477 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data" (OuterVolumeSpecName: "config-data") pod "4d4063e1-1725-43e5-bf87-422d8e4a0e5b" (UID: "4d4063e1-1725-43e5-bf87-422d8e4a0e5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.671284 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts" (OuterVolumeSpecName: "scripts") pod "4d4063e1-1725-43e5-bf87-422d8e4a0e5b" (UID: "4d4063e1-1725-43e5-bf87-422d8e4a0e5b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.741671 4885 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.741702 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.741711 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.741721 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2bq7\" (UniqueName: \"kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7\") on node \"crc\" DevicePath \"\""
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.748501 4885 generic.go:334] "Generic (PLEG): container finished" podID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerID="321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097" exitCode=137
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.748541 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerDied","Data":"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097"}
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.748570 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dbdd8c5b9-56mvx"
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.748591 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerDied","Data":"e892f970f210beda78d54563132ede0defb9bfa40842e20b8ca03cfb2cdffe13"}
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.748611 4885 scope.go:117] "RemoveContainer" containerID="1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0"
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.781296 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"]
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.788692 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"]
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.951165 4885 scope.go:117] "RemoveContainer" containerID="321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097"
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.977968 4885 scope.go:117] "RemoveContainer" containerID="1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0"
Mar 08 21:13:57 crc kubenswrapper[4885]: E0308 21:13:57.978454 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0\": container with ID starting with 1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0 not found: ID does not exist" containerID="1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0"
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.978519 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0"} err="failed to get container status \"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0\": rpc error: code = NotFound desc = could not find container \"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0\": container with ID starting with 1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0 not found: ID does not exist"
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.978560 4885 scope.go:117] "RemoveContainer" containerID="321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097"
Mar 08 21:13:57 crc kubenswrapper[4885]: E0308 21:13:57.978992 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097\": container with ID starting with 321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097 not found: ID does not exist" containerID="321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097"
Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.979039 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097"} err="failed to get container status \"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097\": rpc error: code = NotFound desc = could not find container \"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097\": container with ID starting with 321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097 not found: ID does not exist"
Mar 08 21:13:59 crc kubenswrapper[4885]: I0308 21:13:59.396726 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" path="/var/lib/kubelet/pods/4d4063e1-1725-43e5-bf87-422d8e4a0e5b/volumes"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.142316 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550074-lqgkm"]
Mar 08 21:14:00 crc kubenswrapper[4885]: E0308 21:14:00.143094 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.143127 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon"
Mar 08 21:14:00 crc kubenswrapper[4885]: E0308 21:14:00.143165 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon-log"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.143178 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon-log"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.143529 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.143585 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon-log"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.144707 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550074-lqgkm"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.147010 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.147263 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.150630 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550074-lqgkm"]
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.151316 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.193464 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7mwg\" (UniqueName: \"kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg\") pod \"auto-csr-approver-29550074-lqgkm\" (UID: \"3a0dd2e1-2283-49bf-b5d1-deb889245d93\") " pod="openshift-infra/auto-csr-approver-29550074-lqgkm"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.296287 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7mwg\" (UniqueName: \"kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg\") pod \"auto-csr-approver-29550074-lqgkm\" (UID: \"3a0dd2e1-2283-49bf-b5d1-deb889245d93\") " pod="openshift-infra/auto-csr-approver-29550074-lqgkm"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.315247 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7mwg\" (UniqueName: \"kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg\") pod \"auto-csr-approver-29550074-lqgkm\" (UID: \"3a0dd2e1-2283-49bf-b5d1-deb889245d93\") " pod="openshift-infra/auto-csr-approver-29550074-lqgkm"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.468481 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550074-lqgkm"
Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.994770 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550074-lqgkm"]
Mar 08 21:14:01 crc kubenswrapper[4885]: W0308 21:14:01.005646 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a0dd2e1_2283_49bf_b5d1_deb889245d93.slice/crio-13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5 WatchSource:0}: Error finding container 13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5: Status 404 returned error can't find the container with id 13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5
Mar 08 21:14:01 crc kubenswrapper[4885]: I0308 21:14:01.789226 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550074-lqgkm" event={"ID":"3a0dd2e1-2283-49bf-b5d1-deb889245d93","Type":"ContainerStarted","Data":"13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5"}
Mar 08 21:14:02 crc kubenswrapper[4885]: I0308 21:14:02.804662 4885 generic.go:334] "Generic (PLEG): container finished" podID="3a0dd2e1-2283-49bf-b5d1-deb889245d93" containerID="49b008c658a6440bfe62e05cf707f768838f42546076db4ab0906a7cc3f15598" exitCode=0
Mar 08 21:14:02 crc kubenswrapper[4885]: I0308 21:14:02.804735 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550074-lqgkm" event={"ID":"3a0dd2e1-2283-49bf-b5d1-deb889245d93","Type":"ContainerDied","Data":"49b008c658a6440bfe62e05cf707f768838f42546076db4ab0906a7cc3f15598"}
Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.240973 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550074-lqgkm"
Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.401750 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7mwg\" (UniqueName: \"kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg\") pod \"3a0dd2e1-2283-49bf-b5d1-deb889245d93\" (UID: \"3a0dd2e1-2283-49bf-b5d1-deb889245d93\") "
Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.409783 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg" (OuterVolumeSpecName: "kube-api-access-n7mwg") pod "3a0dd2e1-2283-49bf-b5d1-deb889245d93" (UID: "3a0dd2e1-2283-49bf-b5d1-deb889245d93"). InnerVolumeSpecName "kube-api-access-n7mwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.504871 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7mwg\" (UniqueName: \"kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg\") on node \"crc\" DevicePath \"\""
Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.830269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550074-lqgkm" event={"ID":"3a0dd2e1-2283-49bf-b5d1-deb889245d93","Type":"ContainerDied","Data":"13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5"}
Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.830784 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5"
Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.830381 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550074-lqgkm"
Mar 08 21:14:05 crc kubenswrapper[4885]: I0308 21:14:05.370878 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:14:05 crc kubenswrapper[4885]: E0308 21:14:05.371167 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:14:05 crc kubenswrapper[4885]: I0308 21:14:05.403718 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550068-8fldh"]
Mar 08 21:14:05 crc kubenswrapper[4885]: I0308 21:14:05.421064 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550068-8fldh"]
Mar 08 21:14:07 crc kubenswrapper[4885]: I0308 21:14:07.389555 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1f5d5d-f061-4187-a9ed-720b291774e5" path="/var/lib/kubelet/pods/2a1f5d5d-f061-4187-a9ed-720b291774e5/volumes"
Mar 08 21:14:07 crc kubenswrapper[4885]: I0308 21:14:07.490145 4885 scope.go:117] "RemoveContainer" containerID="dff338c2a2d2f522bdec5e9f4d11ce93afde12127a49bc5918d50f6e48f1aa67"
Mar 08 21:14:07 crc kubenswrapper[4885]: I0308 21:14:07.521114 4885 scope.go:117] "RemoveContainer" containerID="238463b8258e15f4cb33c673abe4bc3d05f4cb5a4961563b7dbb833ea2602b95"
Mar 08 21:14:07 crc kubenswrapper[4885]: I0308 21:14:07.589711 4885 scope.go:117] "RemoveContainer" containerID="dcaa0b7048fe6fee9e8064fda1b6f6cbab5d7f0172b9d8b64c22f15e682b913a"
Mar 08 21:14:07 crc kubenswrapper[4885]: I0308 21:14:07.661203 4885 scope.go:117] "RemoveContainer" containerID="a8dfd7b1b0ea895893398ba92cb9f076303594f9c38eca7bff04c272e28927af"
Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.893154 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"]
Mar 08 21:14:14 crc kubenswrapper[4885]: E0308 21:14:14.894088 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a0dd2e1-2283-49bf-b5d1-deb889245d93" containerName="oc"
Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.894104 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a0dd2e1-2283-49bf-b5d1-deb889245d93" containerName="oc"
Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.894297 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a0dd2e1-2283-49bf-b5d1-deb889245d93" containerName="oc"
Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.895736 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.898080 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.925337 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"]
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.063415 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.064011 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmkf2\" (UniqueName: \"kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.064125 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.165900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmkf2\" (UniqueName: \"kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.165967 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.166019 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.166575 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.166795 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.185566 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmkf2\" (UniqueName: \"kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.227668 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.550281 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"]
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.974682 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerStarted","Data":"64e7ae94816bf73cc80344e8edd25febc79b249fff3bbc0c09e9e009ae87967d"}
Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.975414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerStarted","Data":"0bf11f7445c675cdc8e89c3998b067d0c8ebc1c2a6a45d885ffa84b05adcb9e9"}
Mar 08 21:14:17 crc kubenswrapper[4885]: I0308 21:14:17.006796 4885 generic.go:334] "Generic (PLEG): container finished" podID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerID="64e7ae94816bf73cc80344e8edd25febc79b249fff3bbc0c09e9e009ae87967d" exitCode=0
Mar 08 21:14:17 crc kubenswrapper[4885]: I0308 21:14:17.007466 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerDied","Data":"64e7ae94816bf73cc80344e8edd25febc79b249fff3bbc0c09e9e009ae87967d"}
Mar 08 21:14:18 crc kubenswrapper[4885]: I0308 21:14:18.054368 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-txw9w"]
Mar 08 21:14:18 crc kubenswrapper[4885]: I0308 21:14:18.067579 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5511-account-create-update-fvjhm"]
Mar 08 21:14:18 crc kubenswrapper[4885]: I0308 21:14:18.081385 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-txw9w"]
Mar 08 21:14:18 crc kubenswrapper[4885]: I0308 21:14:18.089876 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5511-account-create-update-fvjhm"]
Mar 08 21:14:19 crc kubenswrapper[4885]: I0308 21:14:19.067759 4885 generic.go:334] "Generic (PLEG): container finished" podID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerID="1c3cdaf8a46d6162f9d363ef6cea18ba7753346787d097ff49ef105b81ca0a44" exitCode=0
Mar 08 21:14:19 crc kubenswrapper[4885]: I0308 21:14:19.068244 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerDied","Data":"1c3cdaf8a46d6162f9d363ef6cea18ba7753346787d097ff49ef105b81ca0a44"}
Mar 08 21:14:19 crc kubenswrapper[4885]: I0308 21:14:19.380804 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14dd4829-951f-4e19-885f-f466dcbf9d1b" path="/var/lib/kubelet/pods/14dd4829-951f-4e19-885f-f466dcbf9d1b/volumes"
Mar 08 21:14:19 crc kubenswrapper[4885]: I0308 21:14:19.382286 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8664af8f-0cf2-4ef8-a701-adbaba058240" path="/var/lib/kubelet/pods/8664af8f-0cf2-4ef8-a701-adbaba058240/volumes"
Mar 08 21:14:20 crc kubenswrapper[4885]: I0308 21:14:20.085685 4885 generic.go:334] "Generic (PLEG): container finished" podID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerID="04dca4eea9d10aed627c2799e958e63d43e3b8b9874d48f81f61e7ac77836dac" exitCode=0
Mar 08 21:14:20 crc kubenswrapper[4885]: I0308 21:14:20.085993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerDied","Data":"04dca4eea9d10aed627c2799e958e63d43e3b8b9874d48f81f61e7ac77836dac"}
Mar 08 21:14:20 crc kubenswrapper[4885]: I0308 21:14:20.368129 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:14:20 crc kubenswrapper[4885]: E0308 21:14:20.368379 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.537051 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.617054 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util\") pod \"99cf706b-d380-4027-ad93-af7f1e5f8a36\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") "
Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.617254 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle\") pod \"99cf706b-d380-4027-ad93-af7f1e5f8a36\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") "
Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.617304 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmkf2\" (UniqueName: \"kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2\") pod \"99cf706b-d380-4027-ad93-af7f1e5f8a36\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") "
Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.618687 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle" (OuterVolumeSpecName: "bundle") pod "99cf706b-d380-4027-ad93-af7f1e5f8a36" (UID: "99cf706b-d380-4027-ad93-af7f1e5f8a36"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.631864 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util" (OuterVolumeSpecName: "util") pod "99cf706b-d380-4027-ad93-af7f1e5f8a36" (UID: "99cf706b-d380-4027-ad93-af7f1e5f8a36"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.706085 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2" (OuterVolumeSpecName: "kube-api-access-hmkf2") pod "99cf706b-d380-4027-ad93-af7f1e5f8a36" (UID: "99cf706b-d380-4027-ad93-af7f1e5f8a36"). InnerVolumeSpecName "kube-api-access-hmkf2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.720452 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util\") on node \"crc\" DevicePath \"\""
Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.720485 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.720498 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmkf2\" (UniqueName: \"kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2\") on node \"crc\" DevicePath \"\""
Mar 08 21:14:22 crc kubenswrapper[4885]: I0308 21:14:22.117832 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerDied","Data":"0bf11f7445c675cdc8e89c3998b067d0c8ebc1c2a6a45d885ffa84b05adcb9e9"}
Mar 08 21:14:22 crc kubenswrapper[4885]: I0308 21:14:22.117905 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bf11f7445c675cdc8e89c3998b067d0c8ebc1c2a6a45d885ffa84b05adcb9e9"
Mar 08 21:14:22 crc kubenswrapper[4885]: I0308 21:14:22.118058 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"
Mar 08 21:14:24 crc kubenswrapper[4885]: I0308 21:14:24.051623 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-q8g6n"]
Mar 08 21:14:24 crc kubenswrapper[4885]: I0308 21:14:24.064902 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-q8g6n"]
Mar 08 21:14:25 crc kubenswrapper[4885]: I0308 21:14:25.387432 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aad146d-597d-436f-ba72-59a57f223ad0" path="/var/lib/kubelet/pods/1aad146d-597d-436f-ba72-59a57f223ad0/volumes"
Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.740646 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z"]
Mar 08 21:14:32 crc kubenswrapper[4885]: E0308 21:14:32.741420 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="extract"
Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.741432 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="extract"
Mar 08 21:14:32 crc kubenswrapper[4885]: E0308 21:14:32.741450 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="util"
Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.741457 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="util"
Mar 08 21:14:32 crc kubenswrapper[4885]: E0308 21:14:32.741479 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="pull"
Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.741485 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="pull"
Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.741679 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="extract" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.742320 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.744047 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.745112 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.745283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-x4rm8" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.755717 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z"] Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.854598 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfp2k\" (UniqueName: \"kubernetes.io/projected/c9864aac-5821-4f9b-bcc8-f07752f987b7-kube-api-access-lfp2k\") pod \"obo-prometheus-operator-68bc856cb9-brf5z\" (UID: \"c9864aac-5821-4f9b-bcc8-f07752f987b7\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.859618 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7"] Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.860782 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.863283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-4xjdl" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.878635 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.883066 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429"] Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.884236 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.892982 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7"] Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.940019 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429"] Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.958309 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.958570 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.958682 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.958738 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfp2k\" (UniqueName: \"kubernetes.io/projected/c9864aac-5821-4f9b-bcc8-f07752f987b7-kube-api-access-lfp2k\") pod \"obo-prometheus-operator-68bc856cb9-brf5z\" (UID: \"c9864aac-5821-4f9b-bcc8-f07752f987b7\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.958896 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.019849 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfp2k\" (UniqueName: \"kubernetes.io/projected/c9864aac-5821-4f9b-bcc8-f07752f987b7-kube-api-access-lfp2k\") pod 
\"obo-prometheus-operator-68bc856cb9-brf5z\" (UID: \"c9864aac-5821-4f9b-bcc8-f07752f987b7\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.034837 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-qfwg5"] Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.037099 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.040537 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.040738 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-22zwk" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.047632 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-qfwg5"] Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.062309 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.062394 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.062470 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.062518 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.066425 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.067859 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.069446 4885 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.077439 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.086548 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.101297 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-m8k65"] Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.102564 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.106748 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6lrsm" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.140870 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-m8k65"] Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.168202 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ks99\" (UniqueName: \"kubernetes.io/projected/482d7874-16e6-4043-95b1-59222dab9edc-kube-api-access-6ks99\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.168254 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/482d7874-16e6-4043-95b1-59222dab9edc-observability-operator-tls\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.176857 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.214118 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.273112 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/482d7874-16e6-4043-95b1-59222dab9edc-observability-operator-tls\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.273166 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8cqt\" (UniqueName: \"kubernetes.io/projected/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-kube-api-access-l8cqt\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.273205 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-openshift-service-ca\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.273347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ks99\" (UniqueName: \"kubernetes.io/projected/482d7874-16e6-4043-95b1-59222dab9edc-kube-api-access-6ks99\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.277861 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/482d7874-16e6-4043-95b1-59222dab9edc-observability-operator-tls\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.302525 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ks99\" (UniqueName: \"kubernetes.io/projected/482d7874-16e6-4043-95b1-59222dab9edc-kube-api-access-6ks99\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.375136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8cqt\" (UniqueName: \"kubernetes.io/projected/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-kube-api-access-l8cqt\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.375575 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-openshift-service-ca\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.376479 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-openshift-service-ca\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc 
kubenswrapper[4885]: I0308 21:14:33.397553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8cqt\" (UniqueName: \"kubernetes.io/projected/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-kube-api-access-l8cqt\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.498571 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.560123 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: W0308 21:14:33.686826 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9864aac_5821_4f9b_bcc8_f07752f987b7.slice/crio-f3781dd6d9b1c8f094ce04de153a2bdbb35c0c024a0ebd5851e2b45dc2ab9da6 WatchSource:0}: Error finding container f3781dd6d9b1c8f094ce04de153a2bdbb35c0c024a0ebd5851e2b45dc2ab9da6: Status 404 returned error can't find the container with id f3781dd6d9b1c8f094ce04de153a2bdbb35c0c024a0ebd5851e2b45dc2ab9da6 Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.691144 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z"] Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.692697 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.796805 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7"] Mar 08 21:14:33 crc kubenswrapper[4885]: W0308 21:14:33.827633 4885 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fe4d43f_e037_431e_98e3_d50194963def.slice/crio-d9af8a6628efffdc52a9f0ca233345602b4641916f08f1aca9d1aa6f7526f180 WatchSource:0}: Error finding container d9af8a6628efffdc52a9f0ca233345602b4641916f08f1aca9d1aa6f7526f180: Status 404 returned error can't find the container with id d9af8a6628efffdc52a9f0ca233345602b4641916f08f1aca9d1aa6f7526f180 Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.876172 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429"] Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.072292 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-qfwg5"] Mar 08 21:14:34 crc kubenswrapper[4885]: W0308 21:14:34.087853 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482d7874_16e6_4043_95b1_59222dab9edc.slice/crio-eed385eec624d1dfea2d50cbc741530d73579febe0eb62e388b12aef1c6815f6 WatchSource:0}: Error finding container eed385eec624d1dfea2d50cbc741530d73579febe0eb62e388b12aef1c6815f6: Status 404 returned error can't find the container with id eed385eec624d1dfea2d50cbc741530d73579febe0eb62e388b12aef1c6815f6 Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.212314 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-m8k65"] Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.226699 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" event={"ID":"482d7874-16e6-4043-95b1-59222dab9edc","Type":"ContainerStarted","Data":"eed385eec624d1dfea2d50cbc741530d73579febe0eb62e388b12aef1c6815f6"} Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.228800 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" event={"ID":"0fe4d43f-e037-431e-98e3-d50194963def","Type":"ContainerStarted","Data":"d9af8a6628efffdc52a9f0ca233345602b4641916f08f1aca9d1aa6f7526f180"} Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.229776 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" event={"ID":"062a5ba6-b2c8-4b0c-95e1-d51c1196f367","Type":"ContainerStarted","Data":"391898f0dd9350cc4c92a0e882e4ded906044960b83c0c12659555eb5adaf87b"} Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.231178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" event={"ID":"65ea3078-ccec-4913-9ce0-873ad93efd0e","Type":"ContainerStarted","Data":"a9102119a5c4610006f8fd7fd0d3a6dc25ed7d1a481e874c246b07c16df5b041"} Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.232227 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" event={"ID":"c9864aac-5821-4f9b-bcc8-f07752f987b7","Type":"ContainerStarted","Data":"f3781dd6d9b1c8f094ce04de153a2bdbb35c0c024a0ebd5851e2b45dc2ab9da6"} Mar 08 21:14:35 crc kubenswrapper[4885]: I0308 21:14:35.368101 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:14:35 crc kubenswrapper[4885]: E0308 21:14:35.368647 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.419451 4885 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fk9ll"] Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.424552 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.438231 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fk9ll"] Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.569803 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.569905 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.569942 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bwkz\" (UniqueName: \"kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.673411 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content\") pod 
\"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.673550 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.673579 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bwkz\" (UniqueName: \"kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.674012 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.674262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.696788 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bwkz\" (UniqueName: \"kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz\") pod \"community-operators-fk9ll\" (UID: 
\"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.795818 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:46 crc kubenswrapper[4885]: I0308 21:14:46.394974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" event={"ID":"0fe4d43f-e037-431e-98e3-d50194963def","Type":"ContainerStarted","Data":"0bec2e78a1f6db95fc14e9c10e3cfa2221801ab222a50e6e6e8b4165bc3f1580"} Mar 08 21:14:46 crc kubenswrapper[4885]: I0308 21:14:46.457651 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" podStartSLOduration=2.42315509 podStartE2EDuration="14.457633431s" podCreationTimestamp="2026-03-08 21:14:32 +0000 UTC" firstStartedPulling="2026-03-08 21:14:33.83178195 +0000 UTC m=+6175.227835973" lastFinishedPulling="2026-03-08 21:14:45.866260281 +0000 UTC m=+6187.262314314" observedRunningTime="2026-03-08 21:14:46.414960818 +0000 UTC m=+6187.811014841" watchObservedRunningTime="2026-03-08 21:14:46.457633431 +0000 UTC m=+6187.853687454" Mar 08 21:14:46 crc kubenswrapper[4885]: I0308 21:14:46.459650 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fk9ll"] Mar 08 21:14:46 crc kubenswrapper[4885]: W0308 21:14:46.487413 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod613a3b7b_ebce_483d_b67d_3c9310c4604d.slice/crio-959f3b59f5078928407d48f4de045117a3898b45a52b5bb95261aebb664945ba WatchSource:0}: Error finding container 959f3b59f5078928407d48f4de045117a3898b45a52b5bb95261aebb664945ba: Status 404 returned error can't find the container with id 
959f3b59f5078928407d48f4de045117a3898b45a52b5bb95261aebb664945ba Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.413909 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" event={"ID":"482d7874-16e6-4043-95b1-59222dab9edc","Type":"ContainerStarted","Data":"dc346109ab017eea4cfc43b0424b8670bd1c301c50a94d88baf86363cd4813ce"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.415303 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.416980 4885 generic.go:334] "Generic (PLEG): container finished" podID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerID="6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4" exitCode=0 Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.417108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerDied","Data":"6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.417184 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerStarted","Data":"959f3b59f5078928407d48f4de045117a3898b45a52b5bb95261aebb664945ba"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.421076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" event={"ID":"062a5ba6-b2c8-4b0c-95e1-d51c1196f367","Type":"ContainerStarted","Data":"356632c80c5662f9b6d1f23a5fc866ba11029da9ffb54c4b6abae369029cb11c"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.421869 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.426490 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" event={"ID":"65ea3078-ccec-4913-9ce0-873ad93efd0e","Type":"ContainerStarted","Data":"fc0c2487f4508b7df7ce3bc93c23366165a739b59177f333ee66b4ab6654a443"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.428412 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" event={"ID":"c9864aac-5821-4f9b-bcc8-f07752f987b7","Type":"ContainerStarted","Data":"3056034978a15a214f614d3a7a9485587e77c48288f9e4fa8470247f9d4f48d9"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.445254 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" podStartSLOduration=3.597096586 podStartE2EDuration="15.445236793s" podCreationTimestamp="2026-03-08 21:14:32 +0000 UTC" firstStartedPulling="2026-03-08 21:14:34.09109461 +0000 UTC m=+6175.487148633" lastFinishedPulling="2026-03-08 21:14:45.939234817 +0000 UTC m=+6187.335288840" observedRunningTime="2026-03-08 21:14:47.438635265 +0000 UTC m=+6188.834689288" watchObservedRunningTime="2026-03-08 21:14:47.445236793 +0000 UTC m=+6188.841290816" Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.461603 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.462804 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" podStartSLOduration=3.459478337 podStartE2EDuration="15.462786263s" podCreationTimestamp="2026-03-08 21:14:32 +0000 UTC" firstStartedPulling="2026-03-08 21:14:33.880161056 +0000 
UTC m=+6175.276215079" lastFinishedPulling="2026-03-08 21:14:45.883468982 +0000 UTC m=+6187.279523005" observedRunningTime="2026-03-08 21:14:47.456159055 +0000 UTC m=+6188.852213078" watchObservedRunningTime="2026-03-08 21:14:47.462786263 +0000 UTC m=+6188.858840286" Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.490136 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" podStartSLOduration=3.296816267 podStartE2EDuration="15.490115856s" podCreationTimestamp="2026-03-08 21:14:32 +0000 UTC" firstStartedPulling="2026-03-08 21:14:33.692458745 +0000 UTC m=+6175.088512768" lastFinishedPulling="2026-03-08 21:14:45.885758334 +0000 UTC m=+6187.281812357" observedRunningTime="2026-03-08 21:14:47.486350214 +0000 UTC m=+6188.882404237" watchObservedRunningTime="2026-03-08 21:14:47.490115856 +0000 UTC m=+6188.886169889" Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.550232 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" podStartSLOduration=2.879088644 podStartE2EDuration="14.550209056s" podCreationTimestamp="2026-03-08 21:14:33 +0000 UTC" firstStartedPulling="2026-03-08 21:14:34.213133871 +0000 UTC m=+6175.609187894" lastFinishedPulling="2026-03-08 21:14:45.884254263 +0000 UTC m=+6187.280308306" observedRunningTime="2026-03-08 21:14:47.544005649 +0000 UTC m=+6188.940059672" watchObservedRunningTime="2026-03-08 21:14:47.550209056 +0000 UTC m=+6188.946263089" Mar 08 21:14:48 crc kubenswrapper[4885]: I0308 21:14:48.368140 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:14:48 crc kubenswrapper[4885]: E0308 21:14:48.368695 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:14:48 crc kubenswrapper[4885]: I0308 21:14:48.439524 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerStarted","Data":"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1"} Mar 08 21:14:50 crc kubenswrapper[4885]: I0308 21:14:50.460616 4885 generic.go:334] "Generic (PLEG): container finished" podID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerID="cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1" exitCode=0 Mar 08 21:14:50 crc kubenswrapper[4885]: I0308 21:14:50.460709 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerDied","Data":"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1"} Mar 08 21:14:51 crc kubenswrapper[4885]: I0308 21:14:51.473970 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerStarted","Data":"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5"} Mar 08 21:14:51 crc kubenswrapper[4885]: I0308 21:14:51.505952 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fk9ll" podStartSLOduration=8.032965296 podStartE2EDuration="11.50590822s" podCreationTimestamp="2026-03-08 21:14:40 +0000 UTC" firstStartedPulling="2026-03-08 21:14:47.418946298 +0000 UTC m=+6188.815000321" lastFinishedPulling="2026-03-08 21:14:50.891889222 +0000 UTC m=+6192.287943245" observedRunningTime="2026-03-08 21:14:51.493386595 
+0000 UTC m=+6192.889440628" watchObservedRunningTime="2026-03-08 21:14:51.50590822 +0000 UTC m=+6192.901962263" Mar 08 21:14:53 crc kubenswrapper[4885]: I0308 21:14:53.564138 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.163573 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.164172 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="f7e4501e-3805-4590-b759-f520d3f98787" containerName="openstackclient" containerID="cri-o://974e17a17c8c2918732ff271aeb4290a267934c8e410394a48e09833b501694e" gracePeriod=2 Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.195595 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.243860 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 21:14:56 crc kubenswrapper[4885]: E0308 21:14:56.244382 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e4501e-3805-4590-b759-f520d3f98787" containerName="openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.244395 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e4501e-3805-4590-b759-f520d3f98787" containerName="openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.244574 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e4501e-3805-4590-b759-f520d3f98787" containerName="openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.245308 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.257308 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f7e4501e-3805-4590-b759-f520d3f98787" podUID="beb866d8-13cb-4dd6-9ce8-a2dad0935453" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.270242 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.335049 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjc7t\" (UniqueName: \"kubernetes.io/projected/beb866d8-13cb-4dd6-9ce8-a2dad0935453-kube-api-access-sjc7t\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.335198 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.335227 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config-secret\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.436600 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") 
" pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.436655 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config-secret\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.436711 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjc7t\" (UniqueName: \"kubernetes.io/projected/beb866d8-13cb-4dd6-9ce8-a2dad0935453-kube-api-access-sjc7t\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.437497 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.444383 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config-secret\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.466817 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjc7t\" (UniqueName: \"kubernetes.io/projected/beb866d8-13cb-4dd6-9ce8-a2dad0935453-kube-api-access-sjc7t\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.507880 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/kube-state-metrics-0"] Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.509066 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.520429 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mbf6w" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.534159 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.586551 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.651389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxz8s\" (UniqueName: \"kubernetes.io/projected/d54c8104-6382-4373-a672-8e2ac804ebba-kube-api-access-sxz8s\") pod \"kube-state-metrics-0\" (UID: \"d54c8104-6382-4373-a672-8e2ac804ebba\") " pod="openstack/kube-state-metrics-0" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.753567 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxz8s\" (UniqueName: \"kubernetes.io/projected/d54c8104-6382-4373-a672-8e2ac804ebba-kube-api-access-sxz8s\") pod \"kube-state-metrics-0\" (UID: \"d54c8104-6382-4373-a672-8e2ac804ebba\") " pod="openstack/kube-state-metrics-0" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.791844 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxz8s\" (UniqueName: \"kubernetes.io/projected/d54c8104-6382-4373-a672-8e2ac804ebba-kube-api-access-sxz8s\") pod \"kube-state-metrics-0\" (UID: \"d54c8104-6382-4373-a672-8e2ac804ebba\") " pod="openstack/kube-state-metrics-0" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.848638 4885 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 21:14:57 crc kubenswrapper[4885]: I0308 21:14:57.454743 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 21:14:57 crc kubenswrapper[4885]: I0308 21:14:57.504164 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 21:14:57 crc kubenswrapper[4885]: I0308 21:14:57.546092 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"beb866d8-13cb-4dd6-9ce8-a2dad0935453","Type":"ContainerStarted","Data":"e6e0d10d5fcd8acc86c54d3629931fc145e0cce2b5b8a547512bea2f630148dd"} Mar 08 21:14:57 crc kubenswrapper[4885]: I0308 21:14:57.547730 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d54c8104-6382-4373-a672-8e2ac804ebba","Type":"ContainerStarted","Data":"911bb8f5c9d1ee0e418d9cedc73f66b111248dddf43de71e7743d3b3f5fef206"} Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.559113 4885 generic.go:334] "Generic (PLEG): container finished" podID="f7e4501e-3805-4590-b759-f520d3f98787" containerID="974e17a17c8c2918732ff271aeb4290a267934c8e410394a48e09833b501694e" exitCode=137 Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.562645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d54c8104-6382-4373-a672-8e2ac804ebba","Type":"ContainerStarted","Data":"b8672606b0c2bd7251a86f9aab2de4d6445651aaddfe82a993d6a5b63cc19382"} Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.563843 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.565439 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"beb866d8-13cb-4dd6-9ce8-a2dad0935453","Type":"ContainerStarted","Data":"c790888d5b3e8f9b62707fb63c279939546f530dd9f1a42ab63cc7bd52c72d51"} Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.644758 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.644742883 podStartE2EDuration="2.644742883s" podCreationTimestamp="2026-03-08 21:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:14:58.638222268 +0000 UTC m=+6200.034276291" watchObservedRunningTime="2026-03-08 21:14:58.644742883 +0000 UTC m=+6200.040796906" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.648484 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.21186234 podStartE2EDuration="2.648464162s" podCreationTimestamp="2026-03-08 21:14:56 +0000 UTC" firstStartedPulling="2026-03-08 21:14:57.479637724 +0000 UTC m=+6198.875691747" lastFinishedPulling="2026-03-08 21:14:57.916239546 +0000 UTC m=+6199.312293569" observedRunningTime="2026-03-08 21:14:58.60320885 +0000 UTC m=+6199.999262873" watchObservedRunningTime="2026-03-08 21:14:58.648464162 +0000 UTC m=+6200.044518185" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.786214 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.795283 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.797268 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.805404 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.805840 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.806768 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.807061 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-bxpws" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.807338 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.863058 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.917841 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2tgf\" (UniqueName: \"kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf\") pod \"f7e4501e-3805-4590-b759-f520d3f98787\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.920464 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret\") pod \"f7e4501e-3805-4590-b759-f520d3f98787\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.921184 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config\") pod \"f7e4501e-3805-4590-b759-f520d3f98787\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.970850 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.970934 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2cc2\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-kube-api-access-l2cc2\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971222 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971251 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971293 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971344 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971633 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf" (OuterVolumeSpecName: "kube-api-access-v2tgf") pod "f7e4501e-3805-4590-b759-f520d3f98787" (UID: "f7e4501e-3805-4590-b759-f520d3f98787"). InnerVolumeSpecName "kube-api-access-v2tgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.039102 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f7e4501e-3805-4590-b759-f520d3f98787" (UID: "f7e4501e-3805-4590-b759-f520d3f98787"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073665 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073724 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2cc2\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-kube-api-access-l2cc2\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073784 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073848 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073868 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: 
\"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073892 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073948 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.074021 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.074033 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2tgf\" (UniqueName: \"kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf\") on node \"crc\" DevicePath \"\"" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.075686 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.079898 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.080569 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.081163 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.082219 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.085440 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.095547 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") 
pod "f7e4501e-3805-4590-b759-f520d3f98787" (UID: "f7e4501e-3805-4590-b759-f520d3f98787"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.096760 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2cc2\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-kube-api-access-l2cc2\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.175411 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.273739 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.405157 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e4501e-3805-4590-b759-f520d3f98787" path="/var/lib/kubelet/pods/f7e4501e-3805-4590-b759-f520d3f98787/volumes" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.431165 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.438459 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443305 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443451 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-v95cc" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443526 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443625 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443642 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443318 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443767 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443820 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.454987 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.573559 4885 scope.go:117] "RemoveContainer" containerID="974e17a17c8c2918732ff271aeb4290a267934c8e410394a48e09833b501694e" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.573732 4885 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.587726 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlpqk\" (UniqueName: \"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-kube-api-access-hlpqk\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.590002 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.606264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.606388 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.606497 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.606577 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.606666 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.608000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.608048 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.608150 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710374 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710719 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710769 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710824 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710874 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlpqk\" (UniqueName: \"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-kube-api-access-hlpqk\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711104 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711129 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") 
" pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711181 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711636 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711710 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711796 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.720970 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.723311 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.735755 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlpqk\" (UniqueName: \"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-kube-api-access-hlpqk\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.737272 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.737310 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/66297ffd1e0f2d6f527b2446228b4ca8e7c611bc8c1afd6b5737c292872e8be2/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.748340 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.752360 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.753413 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.800050 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\") pod 
\"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.813098 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.828529 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.138740 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb"] Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.140710 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.143307 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.143517 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.152337 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb"] Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.291385 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 21:15:00 crc kubenswrapper[4885]: W0308 21:15:00.293191 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefd2a302_0f57_40e0_9a28_0a1cdfabfc5e.slice/crio-59495d3e6487083aec80beffdf27c36e1c672dc6fcce1b1690dea80a74678d89 WatchSource:0}: Error finding container 
59495d3e6487083aec80beffdf27c36e1c672dc6fcce1b1690dea80a74678d89: Status 404 returned error can't find the container with id 59495d3e6487083aec80beffdf27c36e1c672dc6fcce1b1690dea80a74678d89 Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.324575 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff6jr\" (UniqueName: \"kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.324794 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.324965 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.426244 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 
21:15:00.426366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.426419 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff6jr\" (UniqueName: \"kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.427208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.432905 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.446611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff6jr\" (UniqueName: \"kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.471877 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.586024 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55b083d5-789c-424a-8e11-f5e2e4bc51b0","Type":"ContainerStarted","Data":"e2bdefa7a0d169d5f24846d48c1cd06ee1ecc3894618d5040aeab1d1cc77ffad"} Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.588005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerStarted","Data":"59495d3e6487083aec80beffdf27c36e1c672dc6fcce1b1690dea80a74678d89"} Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.802074 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.803238 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.866019 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:15:01 crc kubenswrapper[4885]: I0308 21:15:01.025225 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb"] Mar 08 21:15:01 crc kubenswrapper[4885]: W0308 21:15:01.027624 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7dec2e5_804e_4bc5_99cc_370c31d352e0.slice/crio-aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7 
WatchSource:0}: Error finding container aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7: Status 404 returned error can't find the container with id aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7 Mar 08 21:15:01 crc kubenswrapper[4885]: I0308 21:15:01.597104 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" event={"ID":"e7dec2e5-804e-4bc5-99cc-370c31d352e0","Type":"ContainerStarted","Data":"b737af02da1f4abbd613f83608c2ed474264bbee77babc5263321db11c1a06ed"} Mar 08 21:15:01 crc kubenswrapper[4885]: I0308 21:15:01.597397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" event={"ID":"e7dec2e5-804e-4bc5-99cc-370c31d352e0","Type":"ContainerStarted","Data":"aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7"} Mar 08 21:15:01 crc kubenswrapper[4885]: I0308 21:15:01.632812 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" podStartSLOduration=1.6327943120000001 podStartE2EDuration="1.632794312s" podCreationTimestamp="2026-03-08 21:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:15:01.621150429 +0000 UTC m=+6203.017204442" watchObservedRunningTime="2026-03-08 21:15:01.632794312 +0000 UTC m=+6203.028848335" Mar 08 21:15:01 crc kubenswrapper[4885]: I0308 21:15:01.683288 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:15:02 crc kubenswrapper[4885]: I0308 21:15:02.215517 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fk9ll"] Mar 08 21:15:02 crc kubenswrapper[4885]: I0308 21:15:02.368644 4885 scope.go:117] "RemoveContainer" 
containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:15:02 crc kubenswrapper[4885]: E0308 21:15:02.369212 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:15:02 crc kubenswrapper[4885]: I0308 21:15:02.620538 4885 generic.go:334] "Generic (PLEG): container finished" podID="e7dec2e5-804e-4bc5-99cc-370c31d352e0" containerID="b737af02da1f4abbd613f83608c2ed474264bbee77babc5263321db11c1a06ed" exitCode=0 Mar 08 21:15:02 crc kubenswrapper[4885]: I0308 21:15:02.621913 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" event={"ID":"e7dec2e5-804e-4bc5-99cc-370c31d352e0","Type":"ContainerDied","Data":"b737af02da1f4abbd613f83608c2ed474264bbee77babc5263321db11c1a06ed"} Mar 08 21:15:03 crc kubenswrapper[4885]: I0308 21:15:03.631294 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fk9ll" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="registry-server" containerID="cri-o://0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5" gracePeriod=2 Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.247318 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.253401 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349415 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff6jr\" (UniqueName: \"kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr\") pod \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349532 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume\") pod \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349774 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities\") pod \"613a3b7b-ebce-483d-b67d-3c9310c4604d\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349808 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content\") pod \"613a3b7b-ebce-483d-b67d-3c9310c4604d\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349868 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume\") pod \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349998 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6bwkz\" (UniqueName: \"kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz\") pod \"613a3b7b-ebce-483d-b67d-3c9310c4604d\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") "
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.350725 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume" (OuterVolumeSpecName: "config-volume") pod "e7dec2e5-804e-4bc5-99cc-370c31d352e0" (UID: "e7dec2e5-804e-4bc5-99cc-370c31d352e0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.358322 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr" (OuterVolumeSpecName: "kube-api-access-ff6jr") pod "e7dec2e5-804e-4bc5-99cc-370c31d352e0" (UID: "e7dec2e5-804e-4bc5-99cc-370c31d352e0"). InnerVolumeSpecName "kube-api-access-ff6jr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.358396 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz" (OuterVolumeSpecName: "kube-api-access-6bwkz") pod "613a3b7b-ebce-483d-b67d-3c9310c4604d" (UID: "613a3b7b-ebce-483d-b67d-3c9310c4604d"). InnerVolumeSpecName "kube-api-access-6bwkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.358423 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e7dec2e5-804e-4bc5-99cc-370c31d352e0" (UID: "e7dec2e5-804e-4bc5-99cc-370c31d352e0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.366568 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities" (OuterVolumeSpecName: "utilities") pod "613a3b7b-ebce-483d-b67d-3c9310c4604d" (UID: "613a3b7b-ebce-483d-b67d-3c9310c4604d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.396005 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "613a3b7b-ebce-483d-b67d-3c9310c4604d" (UID: "613a3b7b-ebce-483d-b67d-3c9310c4604d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453410 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bwkz\" (UniqueName: \"kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz\") on node \"crc\" DevicePath \"\""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453444 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff6jr\" (UniqueName: \"kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr\") on node \"crc\" DevicePath \"\""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453458 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume\") on node \"crc\" DevicePath \"\""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453471 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453483 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453494 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.645440 4885 generic.go:334] "Generic (PLEG): container finished" podID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerID="0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5" exitCode=0
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.645582 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerDied","Data":"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5"}
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.645623 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerDied","Data":"959f3b59f5078928407d48f4de045117a3898b45a52b5bb95261aebb664945ba"}
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.645653 4885 scope.go:117] "RemoveContainer" containerID="0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.645860 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fk9ll"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.656664 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" event={"ID":"e7dec2e5-804e-4bc5-99cc-370c31d352e0","Type":"ContainerDied","Data":"aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7"}
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.656699 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.656805 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.691212 4885 scope.go:117] "RemoveContainer" containerID="cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.718598 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b"]
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.720395 4885 scope.go:117] "RemoveContainer" containerID="6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.731978 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fk9ll"]
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.739103 4885 scope.go:117] "RemoveContainer" containerID="0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5"
Mar 08 21:15:04 crc kubenswrapper[4885]: E0308 21:15:04.739449 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5\": container with ID starting with 0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5 not found: ID does not exist" containerID="0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.739482 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5"} err="failed to get container status \"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5\": rpc error: code = NotFound desc = could not find container \"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5\": container with ID starting with 0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5 not found: ID does not exist"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.739501 4885 scope.go:117] "RemoveContainer" containerID="cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1"
Mar 08 21:15:04 crc kubenswrapper[4885]: E0308 21:15:04.740100 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1\": container with ID starting with cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1 not found: ID does not exist" containerID="cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.740158 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1"} err="failed to get container status \"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1\": rpc error: code = NotFound desc = could not find container \"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1\": container with ID starting with cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1 not found: ID does not exist"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.740200 4885 scope.go:117] "RemoveContainer" containerID="6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.740294 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b"]
Mar 08 21:15:04 crc kubenswrapper[4885]: E0308 21:15:04.740503 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4\": container with ID starting with 6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4 not found: ID does not exist" containerID="6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.740526 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4"} err="failed to get container status \"6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4\": rpc error: code = NotFound desc = could not find container \"6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4\": container with ID starting with 6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4 not found: ID does not exist"
Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.749430 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fk9ll"]
Mar 08 21:15:05 crc kubenswrapper[4885]: I0308 21:15:05.388820 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" path="/var/lib/kubelet/pods/613a3b7b-ebce-483d-b67d-3c9310c4604d/volumes"
Mar 08 21:15:05 crc kubenswrapper[4885]: I0308 21:15:05.390221 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c873212d-4c8c-4d2c-ad89-be5ff96db764" path="/var/lib/kubelet/pods/c873212d-4c8c-4d2c-ad89-be5ff96db764/volumes"
Mar 08 21:15:06 crc kubenswrapper[4885]: I0308 21:15:06.689560 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55b083d5-789c-424a-8e11-f5e2e4bc51b0","Type":"ContainerStarted","Data":"274a7144e1aac5f2ec96a6fc64db084ea1afaf1a3f51956a214dcff2746684b9"}
Mar 08 21:15:06 crc kubenswrapper[4885]: I0308 21:15:06.857053 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 08 21:15:07 crc kubenswrapper[4885]: I0308 21:15:07.713790 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerStarted","Data":"4d100beefd1aa185ac08371a351c60d0ea1b1149bf7ccddb8acd0f5e80c81fd4"}
Mar 08 21:15:07 crc kubenswrapper[4885]: I0308 21:15:07.798046 4885 scope.go:117] "RemoveContainer" containerID="ce1aec0cb989ce899ff18178129c96b0d95ae41f48f578bbb62ca6f679d83d8f"
Mar 08 21:15:07 crc kubenswrapper[4885]: I0308 21:15:07.849238 4885 scope.go:117] "RemoveContainer" containerID="5a8b5b45c081a377860a6fc52da869749d2af03a3b4e62e944ef9b2a484b5105"
Mar 08 21:15:07 crc kubenswrapper[4885]: I0308 21:15:07.899040 4885 scope.go:117] "RemoveContainer" containerID="38c31fe4cedc7f6d10ab5073880d121f8be2e24d735953c27d9d2bfdad42cb59"
Mar 08 21:15:07 crc kubenswrapper[4885]: I0308 21:15:07.936072 4885 scope.go:117] "RemoveContainer" containerID="ee3f4f74f30598ca5d1ebc5c4a12e553e2064229a545cf14384e548c26e071ad"
Mar 08 21:15:14 crc kubenswrapper[4885]: I0308 21:15:14.369359 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:15:14 crc kubenswrapper[4885]: E0308 21:15:14.370736 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:15:14 crc kubenswrapper[4885]: I0308 21:15:14.800666 4885 generic.go:334] "Generic (PLEG): container finished" podID="55b083d5-789c-424a-8e11-f5e2e4bc51b0" containerID="274a7144e1aac5f2ec96a6fc64db084ea1afaf1a3f51956a214dcff2746684b9" exitCode=0
Mar 08 21:15:14 crc kubenswrapper[4885]: I0308 21:15:14.800884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55b083d5-789c-424a-8e11-f5e2e4bc51b0","Type":"ContainerDied","Data":"274a7144e1aac5f2ec96a6fc64db084ea1afaf1a3f51956a214dcff2746684b9"}
Mar 08 21:15:14 crc kubenswrapper[4885]: I0308 21:15:14.803368 4885 generic.go:334] "Generic (PLEG): container finished" podID="efd2a302-0f57-40e0-9a28-0a1cdfabfc5e" containerID="4d100beefd1aa185ac08371a351c60d0ea1b1149bf7ccddb8acd0f5e80c81fd4" exitCode=0
Mar 08 21:15:14 crc kubenswrapper[4885]: I0308 21:15:14.803393 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerDied","Data":"4d100beefd1aa185ac08371a351c60d0ea1b1149bf7ccddb8acd0f5e80c81fd4"}
Mar 08 21:15:14 crc kubenswrapper[4885]: E0308 21:15:14.862468 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b083d5_789c_424a_8e11_f5e2e4bc51b0.slice/crio-274a7144e1aac5f2ec96a6fc64db084ea1afaf1a3f51956a214dcff2746684b9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b083d5_789c_424a_8e11_f5e2e4bc51b0.slice/crio-conmon-274a7144e1aac5f2ec96a6fc64db084ea1afaf1a3f51956a214dcff2746684b9.scope\": RecentStats: unable to find data in memory cache]"
Mar 08 21:15:18 crc kubenswrapper[4885]: I0308 21:15:18.845108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55b083d5-789c-424a-8e11-f5e2e4bc51b0","Type":"ContainerStarted","Data":"657c612348a2f491e6f75e8e6c3ff9ef83c529e0953572ff0f884efe8b814589"}
Mar 08 21:15:22 crc kubenswrapper[4885]: I0308 21:15:22.936948 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55b083d5-789c-424a-8e11-f5e2e4bc51b0","Type":"ContainerStarted","Data":"7a6f7c64ec98c538a7b7d12657e48310811462987ecb9920e1e0dc0ec259cf52"}
Mar 08 21:15:22 crc kubenswrapper[4885]: I0308 21:15:22.937708 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Mar 08 21:15:22 crc kubenswrapper[4885]: I0308 21:15:22.939291 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerStarted","Data":"5e24498f6f6c1be3bbf0a8e4bf7e7e4c681dc4e986a6d227ba792cd00cbe832f"}
Mar 08 21:15:22 crc kubenswrapper[4885]: I0308 21:15:22.943193 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Mar 08 21:15:22 crc kubenswrapper[4885]: I0308 21:15:22.965118 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.086727298 podStartE2EDuration="24.965085713s" podCreationTimestamp="2026-03-08 21:14:58 +0000 UTC" firstStartedPulling="2026-03-08 21:14:59.851844807 +0000 UTC m=+6201.247898830" lastFinishedPulling="2026-03-08 21:15:17.730203222 +0000 UTC m=+6219.126257245" observedRunningTime="2026-03-08 21:15:22.9586393 +0000 UTC m=+6224.354693363" watchObservedRunningTime="2026-03-08 21:15:22.965085713 +0000 UTC m=+6224.361139766"
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.071225 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9f6d-account-create-update-jn2lc"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.086677 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-qgblt"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.098601 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-58ntc"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.113116 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9f6d-account-create-update-jn2lc"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.121169 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nk6qt"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.128578 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-58ntc"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.136940 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-eab3-account-create-update-8r4nd"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.147704 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nk6qt"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.158289 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-eab3-account-create-update-8r4nd"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.166805 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-qgblt"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.174153 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cd23-account-create-update-5qvdh"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.185891 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-cd23-account-create-update-5qvdh"]
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.383306 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="636cf333-497f-4fcf-9d2d-ebfe48c81d75" path="/var/lib/kubelet/pods/636cf333-497f-4fcf-9d2d-ebfe48c81d75/volumes"
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.383903 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ea00b6-97bd-459b-ad43-bbfc5862cc4c" path="/var/lib/kubelet/pods/64ea00b6-97bd-459b-ad43-bbfc5862cc4c/volumes"
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.384536 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869febc8-e7d9-4723-bc87-567e08849a27" path="/var/lib/kubelet/pods/869febc8-e7d9-4723-bc87-567e08849a27/volumes"
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.385226 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a6e4793-0be0-4d9f-b96a-c8877648415e" path="/var/lib/kubelet/pods/9a6e4793-0be0-4d9f-b96a-c8877648415e/volumes"
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.386452 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b81f14-560e-4a64-88c7-164fbb0b4f8b" path="/var/lib/kubelet/pods/b7b81f14-560e-4a64-88c7-164fbb0b4f8b/volumes"
Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.387189 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafc9a8b-2cbe-465d-8055-e6c2675b80a4" path="/var/lib/kubelet/pods/bafc9a8b-2cbe-465d-8055-e6c2675b80a4/volumes"
Mar 08 21:15:26 crc kubenswrapper[4885]: I0308 21:15:26.368956 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:15:26 crc kubenswrapper[4885]: E0308 21:15:26.369746 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:15:26 crc kubenswrapper[4885]: I0308 21:15:26.992812 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerStarted","Data":"53a623f6ce2d7149679dbc2084be8deda15fe862e5a06fd0f00bcfae38424432"}
Mar 08 21:15:31 crc kubenswrapper[4885]: I0308 21:15:31.040697 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerStarted","Data":"e80b916b9d03c9c88aa8821bf0a20149e5a099858202885cabc9651d7f25da60"}
Mar 08 21:15:31 crc kubenswrapper[4885]: I0308 21:15:31.094891 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.246340173 podStartE2EDuration="33.094852905s" podCreationTimestamp="2026-03-08 21:14:58 +0000 UTC" firstStartedPulling="2026-03-08 21:15:00.295266292 +0000 UTC m=+6201.691320305" lastFinishedPulling="2026-03-08 21:15:30.143779014 +0000 UTC m=+6231.539833037" observedRunningTime="2026-03-08 21:15:31.084076026 +0000 UTC m=+6232.480130089" watchObservedRunningTime="2026-03-08 21:15:31.094852905 +0000 UTC m=+6232.490906978"
Mar 08 21:15:34 crc kubenswrapper[4885]: I0308 21:15:34.814403 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Mar 08 21:15:35 crc kubenswrapper[4885]: I0308 21:15:35.064321 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7pj4r"]
Mar 08 21:15:35 crc kubenswrapper[4885]: I0308 21:15:35.103849 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7pj4r"]
Mar 08 21:15:35 crc kubenswrapper[4885]: I0308 21:15:35.387998 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b06fac1b-774d-4b4d-afd9-58024d9e5903" path="/var/lib/kubelet/pods/b06fac1b-774d-4b4d-afd9-58024d9e5903/volumes"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.452662 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 08 21:15:38 crc kubenswrapper[4885]: E0308 21:15:38.453564 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="registry-server"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.453581 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="registry-server"
Mar 08 21:15:38 crc kubenswrapper[4885]: E0308 21:15:38.453602 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="extract-utilities"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.453610 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="extract-utilities"
Mar 08 21:15:38 crc kubenswrapper[4885]: E0308 21:15:38.453635 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="extract-content"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.453643 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="extract-content"
Mar 08 21:15:38 crc kubenswrapper[4885]: E0308 21:15:38.453656 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7dec2e5-804e-4bc5-99cc-370c31d352e0" containerName="collect-profiles"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.453665 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dec2e5-804e-4bc5-99cc-370c31d352e0" containerName="collect-profiles"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.453949 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7dec2e5-804e-4bc5-99cc-370c31d352e0" containerName="collect-profiles"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.453968 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="registry-server"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.456407 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.461486 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.461686 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.462451 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.651382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652266 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652414 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652508 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652616 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls9p8\" (UniqueName: \"kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652698 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652773 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756146 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756256 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls9p8\" (UniqueName: \"kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756336 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756415 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756567 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756632 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756694 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.765859 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.766897 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.767397 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.769913 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.770385 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.780373 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.794039 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls9p8\" (UniqueName: \"kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0"
Mar 08 21:15:39 crc kubenswrapper[4885]: I0308 21:15:39.079154 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 08 21:15:42 crc kubenswrapper[4885]: I0308 21:15:41.369555 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:15:42 crc kubenswrapper[4885]: E0308 21:15:41.370225 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:15:43 crc kubenswrapper[4885]: I0308 21:15:43.023319 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 21:15:43 crc kubenswrapper[4885]: W0308 21:15:43.042875 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b866cf8_8618_4c89_baa5_b47d10251b3a.slice/crio-6e8205593d7f835249ba0a7533525a04e8d6f972a864dcef29634dfd80f33c1a WatchSource:0}: Error finding container 6e8205593d7f835249ba0a7533525a04e8d6f972a864dcef29634dfd80f33c1a: Status 404 returned error can't find the container with id 6e8205593d7f835249ba0a7533525a04e8d6f972a864dcef29634dfd80f33c1a
Mar 08 21:15:43 crc kubenswrapper[4885]: I0308 21:15:43.231367 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerStarted","Data":"6e8205593d7f835249ba0a7533525a04e8d6f972a864dcef29634dfd80f33c1a"}
Mar 08 21:15:44 crc kubenswrapper[4885]: I0308 21:15:44.242122 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerStarted","Data":"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"}
Mar 08 21:15:44 crc kubenswrapper[4885]: I0308 21:15:44.814611 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Mar 08 21:15:44 crc kubenswrapper[4885]: I0308 21:15:44.816863 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Mar 08 21:15:45 crc kubenswrapper[4885]: I0308 21:15:45.257740 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerStarted","Data":"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"}
Mar 08 21:15:45 crc kubenswrapper[4885]: I0308 21:15:45.258408 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Mar 08 21:15:46 crc kubenswrapper[4885]: I0308 21:15:46.268514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerStarted","Data":"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"}
Mar 08 21:15:48 crc kubenswrapper[4885]: I0308 21:15:48.290465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerStarted","Data":"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"}
Mar 08 21:15:48 crc kubenswrapper[4885]: I0308 21:15:48.293392 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 08 21:15:48 crc kubenswrapper[4885]: I0308 21:15:48.323118 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.81003254 podStartE2EDuration="10.323096694s" podCreationTimestamp="2026-03-08 21:15:38 +0000 UTC" firstStartedPulling="2026-03-08 21:15:43.050846612 +0000 UTC m=+6244.446900645" lastFinishedPulling="2026-03-08 21:15:47.563910756 +0000 UTC m=+6248.959964799" observedRunningTime="2026-03-08 21:15:48.314815492 +0000 UTC m=+6249.710869515" watchObservedRunningTime="2026-03-08 21:15:48.323096694 +0000 UTC m=+6249.719150727"
Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.368203 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:15:52 crc kubenswrapper[4885]: E0308 21:15:52.368983 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.480199 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-jfnt7"]
Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.482327 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jfnt7"
Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.496004 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jfnt7"]
Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.584025 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-148a-account-create-update-vw6hm"]
Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.585475 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-148a-account-create-update-vw6hm"
Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.592493 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret"
Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.594606 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-148a-account-create-update-vw6hm"]
Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.616576 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpmck\" (UniqueName: \"kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck\") pod \"aodh-db-create-jfnt7\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7"
Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.616719 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts\") pod \"aodh-db-create-jfnt7\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7"
Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.719102 4885
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpmck\" (UniqueName: \"kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck\") pod \"aodh-db-create-jfnt7\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.719168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts\") pod \"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.719296 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts\") pod \"aodh-db-create-jfnt7\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.719327 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x7v7\" (UniqueName: \"kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7\") pod \"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.719975 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts\") pod \"aodh-db-create-jfnt7\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.735877 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpmck\" (UniqueName: \"kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck\") pod \"aodh-db-create-jfnt7\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.799975 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.821268 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x7v7\" (UniqueName: \"kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7\") pod \"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.821529 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts\") pod \"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.822735 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts\") pod \"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.844242 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x7v7\" (UniqueName: \"kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7\") pod 
\"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.903990 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:53 crc kubenswrapper[4885]: I0308 21:15:53.044764 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ps8dx"] Mar 08 21:15:53 crc kubenswrapper[4885]: I0308 21:15:53.058862 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ps8dx"] Mar 08 21:15:53 crc kubenswrapper[4885]: I0308 21:15:53.378015 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eed9605f-3b77-4800-9534-6d8f2654f392" path="/var/lib/kubelet/pods/eed9605f-3b77-4800-9534-6d8f2654f392/volumes" Mar 08 21:15:53 crc kubenswrapper[4885]: I0308 21:15:53.494218 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-148a-account-create-update-vw6hm"] Mar 08 21:15:53 crc kubenswrapper[4885]: I0308 21:15:53.515846 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jfnt7"] Mar 08 21:15:53 crc kubenswrapper[4885]: W0308 21:15:53.519807 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92cde86b_0d50_444d_b116_e32fbf5004f9.slice/crio-04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba WatchSource:0}: Error finding container 04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba: Status 404 returned error can't find the container with id 04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba Mar 08 21:15:53 crc kubenswrapper[4885]: W0308 21:15:53.534113 4885 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb454a1c4_958a_40a9_8c50_9154281574fd.slice/crio-c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104 WatchSource:0}: Error finding container c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104: Status 404 returned error can't find the container with id c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104 Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.029031 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vk668"] Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.038832 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vk668"] Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.356257 4885 generic.go:334] "Generic (PLEG): container finished" podID="b454a1c4-958a-40a9-8c50-9154281574fd" containerID="a35a9b66ff3babcb2662995c86d66b4cb67b7df0bec572a2fefb5352c1e090cb" exitCode=0 Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.356333 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jfnt7" event={"ID":"b454a1c4-958a-40a9-8c50-9154281574fd","Type":"ContainerDied","Data":"a35a9b66ff3babcb2662995c86d66b4cb67b7df0bec572a2fefb5352c1e090cb"} Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.356384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jfnt7" event={"ID":"b454a1c4-958a-40a9-8c50-9154281574fd","Type":"ContainerStarted","Data":"c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104"} Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.358036 4885 generic.go:334] "Generic (PLEG): container finished" podID="92cde86b-0d50-444d-b116-e32fbf5004f9" containerID="870ff46cb1f6250fba56c9497a2a58f99777f85302f8adb2a09cd3289b27392e" exitCode=0 Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.358062 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-148a-account-create-update-vw6hm" event={"ID":"92cde86b-0d50-444d-b116-e32fbf5004f9","Type":"ContainerDied","Data":"870ff46cb1f6250fba56c9497a2a58f99777f85302f8adb2a09cd3289b27392e"} Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.358077 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-148a-account-create-update-vw6hm" event={"ID":"92cde86b-0d50-444d-b116-e32fbf5004f9","Type":"ContainerStarted","Data":"04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba"} Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.391875 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9535a5b-072e-4a1f-b9e4-89942ba9e800" path="/var/lib/kubelet/pods/a9535a5b-072e-4a1f-b9e4-89942ba9e800/volumes" Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.857345 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.866250 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.990733 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x7v7\" (UniqueName: \"kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7\") pod \"92cde86b-0d50-444d-b116-e32fbf5004f9\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.990816 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts\") pod \"92cde86b-0d50-444d-b116-e32fbf5004f9\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.990909 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts\") pod \"b454a1c4-958a-40a9-8c50-9154281574fd\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.991085 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpmck\" (UniqueName: \"kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck\") pod \"b454a1c4-958a-40a9-8c50-9154281574fd\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.991697 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b454a1c4-958a-40a9-8c50-9154281574fd" (UID: "b454a1c4-958a-40a9-8c50-9154281574fd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.991706 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92cde86b-0d50-444d-b116-e32fbf5004f9" (UID: "92cde86b-0d50-444d-b116-e32fbf5004f9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.003595 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7" (OuterVolumeSpecName: "kube-api-access-2x7v7") pod "92cde86b-0d50-444d-b116-e32fbf5004f9" (UID: "92cde86b-0d50-444d-b116-e32fbf5004f9"). InnerVolumeSpecName "kube-api-access-2x7v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.004678 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck" (OuterVolumeSpecName: "kube-api-access-lpmck") pod "b454a1c4-958a-40a9-8c50-9154281574fd" (UID: "b454a1c4-958a-40a9-8c50-9154281574fd"). InnerVolumeSpecName "kube-api-access-lpmck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.093844 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpmck\" (UniqueName: \"kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.093879 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x7v7\" (UniqueName: \"kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.093895 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.093935 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.384293 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jfnt7" event={"ID":"b454a1c4-958a-40a9-8c50-9154281574fd","Type":"ContainerDied","Data":"c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104"} Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.384326 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.384346 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.387649 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-148a-account-create-update-vw6hm" event={"ID":"92cde86b-0d50-444d-b116-e32fbf5004f9","Type":"ContainerDied","Data":"04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba"} Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.387688 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.387764 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.048960 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-hj6ng"] Mar 08 21:15:58 crc kubenswrapper[4885]: E0308 21:15:58.049634 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b454a1c4-958a-40a9-8c50-9154281574fd" containerName="mariadb-database-create" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.049648 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b454a1c4-958a-40a9-8c50-9154281574fd" containerName="mariadb-database-create" Mar 08 21:15:58 crc kubenswrapper[4885]: E0308 21:15:58.049673 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92cde86b-0d50-444d-b116-e32fbf5004f9" containerName="mariadb-account-create-update" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.049681 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="92cde86b-0d50-444d-b116-e32fbf5004f9" 
containerName="mariadb-account-create-update" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.049941 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b454a1c4-958a-40a9-8c50-9154281574fd" containerName="mariadb-database-create" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.049974 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="92cde86b-0d50-444d-b116-e32fbf5004f9" containerName="mariadb-account-create-update" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.050679 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.053448 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.053800 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.054127 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.054276 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-sjjm8" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.064906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-hj6ng"] Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.178675 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.179018 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8v2ph\" (UniqueName: \"kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.179318 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.179569 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.281375 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.281447 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v2ph\" (UniqueName: \"kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.281512 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.281571 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.288290 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.297861 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.300580 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.301901 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v2ph\" (UniqueName: \"kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc 
kubenswrapper[4885]: I0308 21:15:58.376470 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.945827 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-hj6ng"] Mar 08 21:15:59 crc kubenswrapper[4885]: I0308 21:15:59.438510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hj6ng" event={"ID":"ddddf0b1-83be-4ebb-8318-9d40522a3efb","Type":"ContainerStarted","Data":"379eb3f97d8a6dc650da41a4a095a30908cad88e08aa021d0aaf2eb851809ee9"} Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.140252 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550076-qtxj8"] Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.143006 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.146387 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.146771 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.147165 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.150203 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550076-qtxj8"] Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.219518 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdd2m\" (UniqueName: \"kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m\") pod 
\"auto-csr-approver-29550076-qtxj8\" (UID: \"6963ac4b-0b7b-489f-a98a-7bad7270d510\") " pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.322861 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdd2m\" (UniqueName: \"kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m\") pod \"auto-csr-approver-29550076-qtxj8\" (UID: \"6963ac4b-0b7b-489f-a98a-7bad7270d510\") " pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.350052 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdd2m\" (UniqueName: \"kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m\") pod \"auto-csr-approver-29550076-qtxj8\" (UID: \"6963ac4b-0b7b-489f-a98a-7bad7270d510\") " pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.470305 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.934992 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550076-qtxj8"] Mar 08 21:16:01 crc kubenswrapper[4885]: I0308 21:16:01.463273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" event={"ID":"6963ac4b-0b7b-489f-a98a-7bad7270d510","Type":"ContainerStarted","Data":"ba804b590126d6d6c76dc5b1f8649d5d3916e29998d69f32c2e772c4710233d2"} Mar 08 21:16:04 crc kubenswrapper[4885]: I0308 21:16:04.370171 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:16:04 crc kubenswrapper[4885]: E0308 21:16:04.371340 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:16:05 crc kubenswrapper[4885]: I0308 21:16:05.517424 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hj6ng" event={"ID":"ddddf0b1-83be-4ebb-8318-9d40522a3efb","Type":"ContainerStarted","Data":"50f56d5baf9ae0368c43d2d5d7b045c4c64547d6f51fe21432b96e232f3f2393"} Mar 08 21:16:05 crc kubenswrapper[4885]: I0308 21:16:05.521645 4885 generic.go:334] "Generic (PLEG): container finished" podID="6963ac4b-0b7b-489f-a98a-7bad7270d510" containerID="e527652c3f32f5179c847a20bd0a6dafb8df7997ca784a705d3328979c68ce90" exitCode=0 Mar 08 21:16:05 crc kubenswrapper[4885]: I0308 21:16:05.521687 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" 
event={"ID":"6963ac4b-0b7b-489f-a98a-7bad7270d510","Type":"ContainerDied","Data":"e527652c3f32f5179c847a20bd0a6dafb8df7997ca784a705d3328979c68ce90"} Mar 08 21:16:05 crc kubenswrapper[4885]: I0308 21:16:05.546428 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-hj6ng" podStartSLOduration=2.065984669 podStartE2EDuration="7.546405771s" podCreationTimestamp="2026-03-08 21:15:58 +0000 UTC" firstStartedPulling="2026-03-08 21:15:58.965902455 +0000 UTC m=+6260.361956478" lastFinishedPulling="2026-03-08 21:16:04.446323547 +0000 UTC m=+6265.842377580" observedRunningTime="2026-03-08 21:16:05.538654244 +0000 UTC m=+6266.934708277" watchObservedRunningTime="2026-03-08 21:16:05.546405771 +0000 UTC m=+6266.942459794" Mar 08 21:16:06 crc kubenswrapper[4885]: I0308 21:16:06.936888 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.118221 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdd2m\" (UniqueName: \"kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m\") pod \"6963ac4b-0b7b-489f-a98a-7bad7270d510\" (UID: \"6963ac4b-0b7b-489f-a98a-7bad7270d510\") " Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.127244 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m" (OuterVolumeSpecName: "kube-api-access-xdd2m") pod "6963ac4b-0b7b-489f-a98a-7bad7270d510" (UID: "6963ac4b-0b7b-489f-a98a-7bad7270d510"). InnerVolumeSpecName "kube-api-access-xdd2m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.220499 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdd2m\" (UniqueName: \"kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.552014 4885 generic.go:334] "Generic (PLEG): container finished" podID="ddddf0b1-83be-4ebb-8318-9d40522a3efb" containerID="50f56d5baf9ae0368c43d2d5d7b045c4c64547d6f51fe21432b96e232f3f2393" exitCode=0 Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.552124 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hj6ng" event={"ID":"ddddf0b1-83be-4ebb-8318-9d40522a3efb","Type":"ContainerDied","Data":"50f56d5baf9ae0368c43d2d5d7b045c4c64547d6f51fe21432b96e232f3f2393"} Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.556506 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" event={"ID":"6963ac4b-0b7b-489f-a98a-7bad7270d510","Type":"ContainerDied","Data":"ba804b590126d6d6c76dc5b1f8649d5d3916e29998d69f32c2e772c4710233d2"} Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.556558 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.556565 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba804b590126d6d6c76dc5b1f8649d5d3916e29998d69f32c2e772c4710233d2" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.041273 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550070-gjrwt"] Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.054026 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550070-gjrwt"] Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.063811 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-8crmt"] Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.073432 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-8crmt"] Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.153262 4885 scope.go:117] "RemoveContainer" containerID="3f3b93600a59d7fdfedddb2e79ea7fb7eee2ed381b6d60e917ab50e93241509a" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.179479 4885 scope.go:117] "RemoveContainer" containerID="85ace4bbc263d67af4ff24cc59994a076cc980df82df1e2ae92a9834af20ce31" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.233410 4885 scope.go:117] "RemoveContainer" containerID="57d34301e8cc7f8e0b2d448fe0ecba13af188f594f144adc104fb3b5dabb2f60" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.267044 4885 scope.go:117] "RemoveContainer" containerID="8602697feac478750bd9bf6e693b70c9e3f1df0afea0deb7c2804af9bf248c24" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.311223 4885 scope.go:117] "RemoveContainer" containerID="293fcdf5f1f3770069df599650a0ea581f09c4e28effba9f99eb6879ddb6a2a4" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.357535 4885 scope.go:117] "RemoveContainer" 
containerID="9053532705caa4a801f382164c347679058c8a5255c223b315fac67e8c18e8ef" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.397324 4885 scope.go:117] "RemoveContainer" containerID="085db1d51848063091ed8cc366e74589bc9b1a67399db7aae932f752c5c7bcca" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.432865 4885 scope.go:117] "RemoveContainer" containerID="08cb7392dd836d2cf5e583b01bad8a88a737b02245c6ec9a4a8e07b52e00a8cf" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.458707 4885 scope.go:117] "RemoveContainer" containerID="da2f1a91c9bb51280241448d964496931abf663a64970bb68efa8e74c760d038" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.490655 4885 scope.go:117] "RemoveContainer" containerID="c4c2ca7970045efef5435938f6bb44bd5446c5ce53a852adfb403510ee1a79c2" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.520578 4885 scope.go:117] "RemoveContainer" containerID="2a5e8c0d61eedd0069d39190cdfa7686da395e0e45e1d4b7133ef0d8e637e513" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.542610 4885 scope.go:117] "RemoveContainer" containerID="bdd2c701bb858773f060623b06a914478bf58cb8470912a63df694c3493b2a12" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.940076 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.060759 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data\") pod \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.061159 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v2ph\" (UniqueName: \"kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph\") pod \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.061215 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle\") pod \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.061397 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts\") pod \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.066444 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts" (OuterVolumeSpecName: "scripts") pod "ddddf0b1-83be-4ebb-8318-9d40522a3efb" (UID: "ddddf0b1-83be-4ebb-8318-9d40522a3efb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.066469 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph" (OuterVolumeSpecName: "kube-api-access-8v2ph") pod "ddddf0b1-83be-4ebb-8318-9d40522a3efb" (UID: "ddddf0b1-83be-4ebb-8318-9d40522a3efb"). InnerVolumeSpecName "kube-api-access-8v2ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.089241 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.092089 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data" (OuterVolumeSpecName: "config-data") pod "ddddf0b1-83be-4ebb-8318-9d40522a3efb" (UID: "ddddf0b1-83be-4ebb-8318-9d40522a3efb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.092795 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddddf0b1-83be-4ebb-8318-9d40522a3efb" (UID: "ddddf0b1-83be-4ebb-8318-9d40522a3efb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.163628 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.163664 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v2ph\" (UniqueName: \"kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.163677 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.163690 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.382383 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3405968f-173e-4ab2-a8ac-699fdaaad4d3" path="/var/lib/kubelet/pods/3405968f-173e-4ab2-a8ac-699fdaaad4d3/volumes" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.383062 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4497ae4b-d188-4afa-9546-11fbe209a9a7" path="/var/lib/kubelet/pods/4497ae4b-d188-4afa-9546-11fbe209a9a7/volumes" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.626290 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hj6ng" event={"ID":"ddddf0b1-83be-4ebb-8318-9d40522a3efb","Type":"ContainerDied","Data":"379eb3f97d8a6dc650da41a4a095a30908cad88e08aa021d0aaf2eb851809ee9"} Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.626342 4885 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="379eb3f97d8a6dc650da41a4a095a30908cad88e08aa021d0aaf2eb851809ee9" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.626423 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.248009 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 08 21:16:13 crc kubenswrapper[4885]: E0308 21:16:13.249290 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddddf0b1-83be-4ebb-8318-9d40522a3efb" containerName="aodh-db-sync" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.249313 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddddf0b1-83be-4ebb-8318-9d40522a3efb" containerName="aodh-db-sync" Mar 08 21:16:13 crc kubenswrapper[4885]: E0308 21:16:13.249353 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6963ac4b-0b7b-489f-a98a-7bad7270d510" containerName="oc" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.249364 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6963ac4b-0b7b-489f-a98a-7bad7270d510" containerName="oc" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.249718 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddddf0b1-83be-4ebb-8318-9d40522a3efb" containerName="aodh-db-sync" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.249740 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6963ac4b-0b7b-489f-a98a-7bad7270d510" containerName="oc" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.253002 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.261965 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.262156 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.262722 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.269892 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-sjjm8" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.356126 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-scripts\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.356198 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-combined-ca-bundle\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.356489 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-config-data\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.356596 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58ql5\" (UniqueName: 
\"kubernetes.io/projected/737065cc-3153-4e0c-b4ee-4ad587c8d494-kube-api-access-58ql5\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.459090 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-combined-ca-bundle\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.459316 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-config-data\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.459381 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58ql5\" (UniqueName: \"kubernetes.io/projected/737065cc-3153-4e0c-b4ee-4ad587c8d494-kube-api-access-58ql5\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.459502 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-scripts\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.465098 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-config-data\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.468279 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-scripts\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.470556 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-combined-ca-bundle\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.483863 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58ql5\" (UniqueName: \"kubernetes.io/projected/737065cc-3153-4e0c-b4ee-4ad587c8d494-kube-api-access-58ql5\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.578084 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 08 21:16:14 crc kubenswrapper[4885]: I0308 21:16:14.059503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 08 21:16:14 crc kubenswrapper[4885]: W0308 21:16:14.062877 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737065cc_3153_4e0c_b4ee_4ad587c8d494.slice/crio-b5e1f8b6f4fd3af2e56f7ed61c4dc2894f51cfcbf9d5f39f093a1abad8aa5f97 WatchSource:0}: Error finding container b5e1f8b6f4fd3af2e56f7ed61c4dc2894f51cfcbf9d5f39f093a1abad8aa5f97: Status 404 returned error can't find the container with id b5e1f8b6f4fd3af2e56f7ed61c4dc2894f51cfcbf9d5f39f093a1abad8aa5f97 Mar 08 21:16:14 crc kubenswrapper[4885]: I0308 21:16:14.714658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"737065cc-3153-4e0c-b4ee-4ad587c8d494","Type":"ContainerStarted","Data":"b5e1f8b6f4fd3af2e56f7ed61c4dc2894f51cfcbf9d5f39f093a1abad8aa5f97"} Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.435943 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.436583 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-central-agent" containerID="cri-o://b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c" gracePeriod=30 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.436661 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="sg-core" containerID="cri-o://6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7" gracePeriod=30 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.436685 4885 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-notification-agent" containerID="cri-o://92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559" gracePeriod=30 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.436807 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="proxy-httpd" containerID="cri-o://a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb" gracePeriod=30 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.726557 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerID="a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb" exitCode=0 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.726588 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerID="6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7" exitCode=2 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.726631 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerDied","Data":"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"} Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.726662 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerDied","Data":"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"} Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.728480 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"737065cc-3153-4e0c-b4ee-4ad587c8d494","Type":"ContainerStarted","Data":"9e724998056529f62b65f6d60184fb5702abd86897e3dd367c0e895004fbb4ff"} Mar 08 21:16:16 crc 
kubenswrapper[4885]: I0308 21:16:16.302750 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.368672 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.369052 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432146 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432187 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432250 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432333 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432378 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432518 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls9p8\" (UniqueName: \"kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432544 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.433546 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.434796 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.438746 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts" (OuterVolumeSpecName: "scripts") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.441357 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8" (OuterVolumeSpecName: "kube-api-access-ls9p8") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "kube-api-access-ls9p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.463536 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.518364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535873 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls9p8\" (UniqueName: \"kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535907 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535935 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535949 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535958 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535967 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.542812 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data" (OuterVolumeSpecName: "config-data") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.638416 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743640 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerID="92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559" exitCode=0 Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743684 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerID="b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c" exitCode=0 Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743707 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerDied","Data":"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"} Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743738 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerDied","Data":"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"} Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743751 4885 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerDied","Data":"6e8205593d7f835249ba0a7533525a04e8d6f972a864dcef29634dfd80f33c1a"}
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743769 4885 scope.go:117] "RemoveContainer" containerID="a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.744126 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.790411 4885 scope.go:117] "RemoveContainer" containerID="6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.790542 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.800265 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.823802 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.824407 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="proxy-httpd"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824425 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="proxy-httpd"
Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.824444 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="sg-core"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824453 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="sg-core"
Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.824469 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-notification-agent"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824478 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-notification-agent"
Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.824499 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-central-agent"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824506 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-central-agent"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824780 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-central-agent"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824804 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="proxy-httpd"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824817 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-notification-agent"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824826 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="sg-core"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.827579 4885 scope.go:117] "RemoveContainer" containerID="92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.838096 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.838233 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.844461 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.844497 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.881613 4885 scope.go:117] "RemoveContainer" containerID="b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.921915 4885 scope.go:117] "RemoveContainer" containerID="a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"
Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.922259 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb\": container with ID starting with a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb not found: ID does not exist" containerID="a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.922302 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"} err="failed to get container status \"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb\": rpc error: code = NotFound desc = could not find container \"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb\": container with ID starting with a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb not found: ID does not exist"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.922332 4885 scope.go:117] "RemoveContainer" containerID="6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"
Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.922650 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7\": container with ID starting with 6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7 not found: ID does not exist" containerID="6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.922684 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"} err="failed to get container status \"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7\": rpc error: code = NotFound desc = could not find container \"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7\": container with ID starting with 6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7 not found: ID does not exist"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.922704 4885 scope.go:117] "RemoveContainer" containerID="92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"
Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.923048 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559\": container with ID starting with 92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559 not found: ID does not exist" containerID="92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923076 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"} err="failed to get container status \"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559\": rpc error: code = NotFound desc = could not find container \"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559\": container with ID starting with 92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559 not found: ID does not exist"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923093 4885 scope.go:117] "RemoveContainer" containerID="b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"
Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.923280 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c\": container with ID starting with b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c not found: ID does not exist" containerID="b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923304 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"} err="failed to get container status \"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c\": rpc error: code = NotFound desc = could not find container \"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c\": container with ID starting with b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c not found: ID does not exist"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923319 4885 scope.go:117] "RemoveContainer" containerID="a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923511 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"} err="failed to get container status \"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb\": rpc error: code = NotFound desc = could not find container \"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb\": container with ID starting with a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb not found: ID does not exist"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923529 4885 scope.go:117] "RemoveContainer" containerID="6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923694 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"} err="failed to get container status \"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7\": rpc error: code = NotFound desc = could not find container \"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7\": container with ID starting with 6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7 not found: ID does not exist"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923711 4885 scope.go:117] "RemoveContainer" containerID="92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923958 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"} err="failed to get container status \"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559\": rpc error: code = NotFound desc = could not find container \"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559\": container with ID starting with 92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559 not found: ID does not exist"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923977 4885 scope.go:117] "RemoveContainer" containerID="b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.924147 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"} err="failed to get container status \"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c\": rpc error: code = NotFound desc = could not find container \"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c\": container with ID starting with b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c not found: ID does not exist"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.943902 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944037 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944124 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944187 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944206 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84gkp\" (UniqueName: \"kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944222 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944244 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046122 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046172 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84gkp\" (UniqueName: \"kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046218 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046247 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046325 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046435 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.047270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.050267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.052082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.059628 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.061388 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.079448 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84gkp\" (UniqueName: \"kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.166115 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.393162 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" path="/var/lib/kubelet/pods/4b866cf8-8618-4c89-baa5-b47d10251b3a/volumes"
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.740234 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.755899 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerStarted","Data":"10f08b0d4e88a44af4a08cc4e989bead8ffc2d60df2b2606efa2e0ed7f81eb68"}
Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.759419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"737065cc-3153-4e0c-b4ee-4ad587c8d494","Type":"ContainerStarted","Data":"bac3d915e9924359acad5bb536ee4f2c4cabb5219af0b97ff034cdebebc70c73"}
Mar 08 21:16:18 crc kubenswrapper[4885]: I0308 21:16:18.768380 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerStarted","Data":"502661d0a2ae8beac2345b6e58d84b4845d66a96d9e98eb6f82b068ae6c89e19"}
Mar 08 21:16:19 crc kubenswrapper[4885]: I0308 21:16:19.793989 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerStarted","Data":"cded1028147ed518d3cc5fbc5d57f0c592438c500828d06d48dc484f460dbf22"}
Mar 08 21:16:20 crc kubenswrapper[4885]: I0308 21:16:20.807014 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"737065cc-3153-4e0c-b4ee-4ad587c8d494","Type":"ContainerStarted","Data":"0dad9935e459a0b4b052bd27c236f7caa734323487a03d6a4ebb135ad0a89581"}
Mar 08 21:16:20 crc kubenswrapper[4885]: I0308 21:16:20.809009 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerStarted","Data":"5e52625e338d70abee579a07cf8c7f1e26c7bb826680c63cfe32a93308e3446e"}
Mar 08 21:16:22 crc kubenswrapper[4885]: I0308 21:16:22.831497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"737065cc-3153-4e0c-b4ee-4ad587c8d494","Type":"ContainerStarted","Data":"2d8e5c5442bec2dbdb49d12f71bc9d1b148f5f828eb04262ea4f52b92d8807ed"}
Mar 08 21:16:22 crc kubenswrapper[4885]: I0308 21:16:22.871092 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.466749646 podStartE2EDuration="9.871069451s" podCreationTimestamp="2026-03-08 21:16:13 +0000 UTC" firstStartedPulling="2026-03-08 21:16:14.066104917 +0000 UTC m=+6275.462158940" lastFinishedPulling="2026-03-08 21:16:22.470424712 +0000 UTC m=+6283.866478745" observedRunningTime="2026-03-08 21:16:22.867152336 +0000 UTC m=+6284.263206369" watchObservedRunningTime="2026-03-08 21:16:22.871069451 +0000 UTC m=+6284.267123474"
Mar 08 21:16:23 crc kubenswrapper[4885]: I0308 21:16:23.843949 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerStarted","Data":"da1dd78b37ae44446a818d4ca7a3cae6318a6e093e913892e1824d2141b2c980"}
Mar 08 21:16:23 crc kubenswrapper[4885]: I0308 21:16:23.845205 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 08 21:16:23 crc kubenswrapper[4885]: I0308 21:16:23.869056 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.734823961 podStartE2EDuration="7.869034252s" podCreationTimestamp="2026-03-08 21:16:16 +0000 UTC" firstStartedPulling="2026-03-08 21:16:17.742644295 +0000 UTC m=+6279.138698308" lastFinishedPulling="2026-03-08 21:16:22.876854566 +0000 UTC m=+6284.272908599" observedRunningTime="2026-03-08 21:16:23.86596546 +0000 UTC m=+6285.262019783" watchObservedRunningTime="2026-03-08 21:16:23.869034252 +0000 UTC m=+6285.265088275"
Mar 08 21:16:28 crc kubenswrapper[4885]: I0308 21:16:28.369043 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:16:28 crc kubenswrapper[4885]: E0308 21:16:28.370107 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.518657 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-wmgbb"]
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.520620 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wmgbb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.537498 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wmgbb"]
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.626763 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-b45a-account-create-update-zt9mb"]
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.628271 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b45a-account-create-update-zt9mb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.630194 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.647431 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b45a-account-create-update-zt9mb"]
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.658115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ttq5\" (UniqueName: \"kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5\") pod \"manila-db-create-wmgbb\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.658393 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts\") pod \"manila-db-create-wmgbb\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.761017 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.761346 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ttq5\" (UniqueName: \"kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5\") pod \"manila-db-create-wmgbb\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.761525 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6trw\" (UniqueName: \"kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.761564 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts\") pod \"manila-db-create-wmgbb\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.762228 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts\") pod \"manila-db-create-wmgbb\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.798269 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ttq5\" (UniqueName: \"kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5\") pod \"manila-db-create-wmgbb\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.840161 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wmgbb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.867365 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.867904 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6trw\" (UniqueName: \"kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.868739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.885270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6trw\" (UniqueName: \"kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb"
Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.946959 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b45a-account-create-update-zt9mb"
Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.373537 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wmgbb"]
Mar 08 21:16:30 crc kubenswrapper[4885]: W0308 21:16:30.382541 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod451cc09f_d6aa_4930_be69_102ce5b86575.slice/crio-8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff WatchSource:0}: Error finding container 8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff: Status 404 returned error can't find the container with id 8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff
Mar 08 21:16:30 crc kubenswrapper[4885]: W0308 21:16:30.562020 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod835eb61f_3559_41d5_9891_23a6ecef9ed1.slice/crio-a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6 WatchSource:0}: Error finding container a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6: Status 404 returned error can't find the container with id a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6
Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.562291 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b45a-account-create-update-zt9mb"]
Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.916191 4885 generic.go:334] "Generic (PLEG): container finished" podID="835eb61f-3559-41d5-9891-23a6ecef9ed1" containerID="8cfe6ba1dd8d427385a1015c78367bf6a932fa6920ddcb36d2679cfdab2e9416" exitCode=0
Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.916259 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b45a-account-create-update-zt9mb" event={"ID":"835eb61f-3559-41d5-9891-23a6ecef9ed1","Type":"ContainerDied","Data":"8cfe6ba1dd8d427385a1015c78367bf6a932fa6920ddcb36d2679cfdab2e9416"}
Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.916287 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b45a-account-create-update-zt9mb" event={"ID":"835eb61f-3559-41d5-9891-23a6ecef9ed1","Type":"ContainerStarted","Data":"a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6"}
Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.918116 4885 generic.go:334] "Generic (PLEG): container finished" podID="451cc09f-d6aa-4930-be69-102ce5b86575" containerID="ab2bac58c78cebfa3dc65d3179c712fb4a25e9ae89fdc3f09281d9b68706ac0c" exitCode=0
Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.918163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wmgbb" event={"ID":"451cc09f-d6aa-4930-be69-102ce5b86575","Type":"ContainerDied","Data":"ab2bac58c78cebfa3dc65d3179c712fb4a25e9ae89fdc3f09281d9b68706ac0c"}
Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.918186 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wmgbb" event={"ID":"451cc09f-d6aa-4930-be69-102ce5b86575","Type":"ContainerStarted","Data":"8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff"}
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.452430 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b45a-account-create-update-zt9mb"
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.456891 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wmgbb"
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.524965 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts\") pod \"835eb61f-3559-41d5-9891-23a6ecef9ed1\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") "
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.525107 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6trw\" (UniqueName: \"kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw\") pod \"835eb61f-3559-41d5-9891-23a6ecef9ed1\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") "
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.525388 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "835eb61f-3559-41d5-9891-23a6ecef9ed1" (UID: "835eb61f-3559-41d5-9891-23a6ecef9ed1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.525972 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts\") pod \"451cc09f-d6aa-4930-be69-102ce5b86575\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") "
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.526110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ttq5\" (UniqueName: \"kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5\") pod \"451cc09f-d6aa-4930-be69-102ce5b86575\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") "
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.526258 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "451cc09f-d6aa-4930-be69-102ce5b86575" (UID: "451cc09f-d6aa-4930-be69-102ce5b86575"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.526556 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.526574 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.531223 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw" (OuterVolumeSpecName: "kube-api-access-g6trw") pod "835eb61f-3559-41d5-9891-23a6ecef9ed1" (UID: "835eb61f-3559-41d5-9891-23a6ecef9ed1"). InnerVolumeSpecName "kube-api-access-g6trw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.531409 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5" (OuterVolumeSpecName: "kube-api-access-7ttq5") pod "451cc09f-d6aa-4930-be69-102ce5b86575" (UID: "451cc09f-d6aa-4930-be69-102ce5b86575"). InnerVolumeSpecName "kube-api-access-7ttq5".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.632619 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ttq5\" (UniqueName: \"kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.632704 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6trw\" (UniqueName: \"kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.941064 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wmgbb" event={"ID":"451cc09f-d6aa-4930-be69-102ce5b86575","Type":"ContainerDied","Data":"8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff"} Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.941431 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.941254 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wmgbb" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.943246 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b45a-account-create-update-zt9mb" event={"ID":"835eb61f-3559-41d5-9891-23a6ecef9ed1","Type":"ContainerDied","Data":"a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6"} Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.943297 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.943333 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-b45a-account-create-update-zt9mb" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.921212 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-gtv5s"] Mar 08 21:16:34 crc kubenswrapper[4885]: E0308 21:16:34.922093 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835eb61f-3559-41d5-9891-23a6ecef9ed1" containerName="mariadb-account-create-update" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.922122 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="835eb61f-3559-41d5-9891-23a6ecef9ed1" containerName="mariadb-account-create-update" Mar 08 21:16:34 crc kubenswrapper[4885]: E0308 21:16:34.922158 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451cc09f-d6aa-4930-be69-102ce5b86575" containerName="mariadb-database-create" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.922167 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="451cc09f-d6aa-4930-be69-102ce5b86575" containerName="mariadb-database-create" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.922414 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="835eb61f-3559-41d5-9891-23a6ecef9ed1" containerName="mariadb-account-create-update" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.922432 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="451cc09f-d6aa-4930-be69-102ce5b86575" containerName="mariadb-database-create" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.923489 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.925455 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-n7qf9" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.927053 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.933560 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-gtv5s"] Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.992087 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb929\" (UniqueName: \"kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.992221 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.992303 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.992381 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.094636 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb929\" (UniqueName: \"kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.094743 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.094789 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.094838 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.101102 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data\") pod \"manila-db-sync-gtv5s\" (UID: 
\"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.101213 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.103624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.113959 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb929\" (UniqueName: \"kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.260527 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.956808 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-gtv5s"] Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.993368 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gtv5s" event={"ID":"4393565c-775a-48fd-a497-602a556ff169","Type":"ContainerStarted","Data":"bb3f562295247fff6acd82f785de1b44f006957949036c25586952cdd995f575"} Mar 08 21:16:41 crc kubenswrapper[4885]: I0308 21:16:41.368831 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:16:41 crc kubenswrapper[4885]: E0308 21:16:41.369685 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:16:42 crc kubenswrapper[4885]: I0308 21:16:42.063679 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gtv5s" event={"ID":"4393565c-775a-48fd-a497-602a556ff169","Type":"ContainerStarted","Data":"ff2590b431e04ce466ce231540eb022968990845b1b8a9f29903a084f907a810"} Mar 08 21:16:42 crc kubenswrapper[4885]: I0308 21:16:42.096152 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-gtv5s" podStartSLOduration=2.964253065 podStartE2EDuration="8.096125433s" podCreationTimestamp="2026-03-08 21:16:34 +0000 UTC" firstStartedPulling="2026-03-08 21:16:35.976493258 +0000 UTC m=+6297.372547301" lastFinishedPulling="2026-03-08 21:16:41.108365636 +0000 UTC m=+6302.504419669" observedRunningTime="2026-03-08 
21:16:42.085245411 +0000 UTC m=+6303.481299494" watchObservedRunningTime="2026-03-08 21:16:42.096125433 +0000 UTC m=+6303.492179496" Mar 08 21:16:44 crc kubenswrapper[4885]: I0308 21:16:44.087427 4885 generic.go:334] "Generic (PLEG): container finished" podID="4393565c-775a-48fd-a497-602a556ff169" containerID="ff2590b431e04ce466ce231540eb022968990845b1b8a9f29903a084f907a810" exitCode=0 Mar 08 21:16:44 crc kubenswrapper[4885]: I0308 21:16:44.087583 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gtv5s" event={"ID":"4393565c-775a-48fd-a497-602a556ff169","Type":"ContainerDied","Data":"ff2590b431e04ce466ce231540eb022968990845b1b8a9f29903a084f907a810"} Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.688549 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.745887 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb929\" (UniqueName: \"kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929\") pod \"4393565c-775a-48fd-a497-602a556ff169\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.746299 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle\") pod \"4393565c-775a-48fd-a497-602a556ff169\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.746451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data\") pod \"4393565c-775a-48fd-a497-602a556ff169\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " Mar 08 21:16:45 crc kubenswrapper[4885]: 
I0308 21:16:45.746513 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data\") pod \"4393565c-775a-48fd-a497-602a556ff169\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.761320 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929" (OuterVolumeSpecName: "kube-api-access-lb929") pod "4393565c-775a-48fd-a497-602a556ff169" (UID: "4393565c-775a-48fd-a497-602a556ff169"). InnerVolumeSpecName "kube-api-access-lb929". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.761453 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "4393565c-775a-48fd-a497-602a556ff169" (UID: "4393565c-775a-48fd-a497-602a556ff169"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.767482 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data" (OuterVolumeSpecName: "config-data") pod "4393565c-775a-48fd-a497-602a556ff169" (UID: "4393565c-775a-48fd-a497-602a556ff169"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.807223 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4393565c-775a-48fd-a497-602a556ff169" (UID: "4393565c-775a-48fd-a497-602a556ff169"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.851838 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb929\" (UniqueName: \"kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.851880 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.851898 4885 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.851914 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.115436 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gtv5s" event={"ID":"4393565c-775a-48fd-a497-602a556ff169","Type":"ContainerDied","Data":"bb3f562295247fff6acd82f785de1b44f006957949036c25586952cdd995f575"} Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.115475 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb3f562295247fff6acd82f785de1b44f006957949036c25586952cdd995f575" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.115531 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.516249 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: E0308 21:16:46.516715 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4393565c-775a-48fd-a497-602a556ff169" containerName="manila-db-sync" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.516733 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4393565c-775a-48fd-a497-602a556ff169" containerName="manila-db-sync" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.516943 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4393565c-775a-48fd-a497-602a556ff169" containerName="manila-db-sync" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.518001 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.529521 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-n7qf9" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.537036 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.541412 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.542614 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.561959 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.602521 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.602811 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603004 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28llg\" (UniqueName: \"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-kube-api-access-28llg\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603177 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-ceph\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603305 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-scripts\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603573 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.606308 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.622035 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.633068 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.703311 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704748 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj9dl\" (UniqueName: \"kubernetes.io/projected/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-kube-api-access-jj9dl\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704806 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704837 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704865 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704903 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-scripts\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704945 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704985 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28llg\" (UniqueName: \"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-kube-api-access-28llg\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705036 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705052 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-ceph\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705070 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705107 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705130 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-scripts\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705137 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705145 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.706350 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.706438 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.711960 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.713250 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.713496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-ceph\") pod \"manila-share-share1-0\" (UID: 
\"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.714171 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.718656 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-scripts\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.721594 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.731644 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28llg\" (UniqueName: \"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-kube-api-access-28llg\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807399 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-scripts\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807487 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807509 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807569 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807611 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807637 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807663 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807679 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zkhb\" (UniqueName: \"kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807703 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807747 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj9dl\" (UniqueName: \"kubernetes.io/projected/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-kube-api-access-jj9dl\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807768 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.811021 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.811822 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.813356 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.814624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-scripts\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.815123 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.834362 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.834947 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.835456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj9dl\" (UniqueName: \"kubernetes.io/projected/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-kube-api-access-jj9dl\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.836783 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.841069 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.849675 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911273 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss6zg\" (UniqueName: \"kubernetes.io/projected/6cffb553-3b2f-404c-a7da-d481d4635cfc-kube-api-access-ss6zg\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911320 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911341 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911360 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-scripts\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911442 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zkhb\" (UniqueName: \"kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911459 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data-custom\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911479 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6cffb553-3b2f-404c-a7da-d481d4635cfc-etc-machine-id\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911498 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cffb553-3b2f-404c-a7da-d481d4635cfc-logs\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911513 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911531 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911572 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.912965 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: 
\"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.913072 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.913613 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.913653 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.928274 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.928624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zkhb\" (UniqueName: \"kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.952577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017102 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data-custom\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017497 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cffb553-3b2f-404c-a7da-d481d4635cfc-etc-machine-id\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cffb553-3b2f-404c-a7da-d481d4635cfc-logs\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017574 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017598 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ss6zg\" (UniqueName: \"kubernetes.io/projected/6cffb553-3b2f-404c-a7da-d481d4635cfc-kube-api-access-ss6zg\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017992 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-scripts\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.018241 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cffb553-3b2f-404c-a7da-d481d4635cfc-etc-machine-id\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.019362 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cffb553-3b2f-404c-a7da-d481d4635cfc-logs\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.022171 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.026531 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 
21:16:47.029503 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-scripts\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.032122 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data-custom\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.041027 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss6zg\" (UniqueName: \"kubernetes.io/projected/6cffb553-3b2f-404c-a7da-d481d4635cfc-kube-api-access-ss6zg\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.265625 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.452335 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.533018 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.615380 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 08 21:16:47 crc kubenswrapper[4885]: W0308 21:16:47.628602 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0142342_e857_4238_b442_8e06ceb406e1.slice/crio-e62f2c8c3b2b512a95f4f947a05e28ebd06fac0f96c91c88bb91c12fba46b795 WatchSource:0}: Error finding container e62f2c8c3b2b512a95f4f947a05e28ebd06fac0f96c91c88bb91c12fba46b795: Status 404 returned error can't find the container with id e62f2c8c3b2b512a95f4f947a05e28ebd06fac0f96c91c88bb91c12fba46b795 Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.752432 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.145044 4885 generic.go:334] "Generic (PLEG): container finished" podID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerID="b94ccd4d45aaad15e9b13ae53af2356f201fb751b721183d822d6ea73417bf92" exitCode=0 Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.145368 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c5b968869-pr98k" event={"ID":"f9f2857a-2ac8-4c56-8fa0-b153c52309f3","Type":"ContainerDied","Data":"b94ccd4d45aaad15e9b13ae53af2356f201fb751b721183d822d6ea73417bf92"} Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.145395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c5b968869-pr98k" 
event={"ID":"f9f2857a-2ac8-4c56-8fa0-b153c52309f3","Type":"ContainerStarted","Data":"4bf88b828aac67327d42c79896ad6c9847535e8fc25a30b67b13b6ef37494170"} Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.148809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"637ae9d4-1fa5-48e0-87d7-5f6004e0352d","Type":"ContainerStarted","Data":"5908dd7b75b520c96c07074b8cd3cf5cc6697d77c8283c37b51c7417dc33522c"} Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.154133 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c0142342-e857-4238-b442-8e06ceb406e1","Type":"ContainerStarted","Data":"e62f2c8c3b2b512a95f4f947a05e28ebd06fac0f96c91c88bb91c12fba46b795"} Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.272081 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.181026 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c5b968869-pr98k" event={"ID":"f9f2857a-2ac8-4c56-8fa0-b153c52309f3","Type":"ContainerStarted","Data":"4cb0262f2854e6a7701317c1b59e47291390cddca3f70f2c87b579f72182540b"} Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.182497 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.188597 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"637ae9d4-1fa5-48e0-87d7-5f6004e0352d","Type":"ContainerStarted","Data":"4bc5e6142d8a64deed9eb23359c60a74e47cfa1b5932eeb09144f0dda2b6ebfe"} Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.192596 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"6cffb553-3b2f-404c-a7da-d481d4635cfc","Type":"ContainerStarted","Data":"bf33145555bba0904a0dd4847e5d62501369bbc4ca17bc741bde452205631e9a"} Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.192629 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6cffb553-3b2f-404c-a7da-d481d4635cfc","Type":"ContainerStarted","Data":"d0e2a34e3ff9d70b1772a58693a392181e85a637a7fb9b1ec45d84c2a2c82ae1"} Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.210336 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c5b968869-pr98k" podStartSLOduration=3.210304277 podStartE2EDuration="3.210304277s" podCreationTimestamp="2026-03-08 21:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:16:49.201779768 +0000 UTC m=+6310.597833791" watchObservedRunningTime="2026-03-08 21:16:49.210304277 +0000 UTC m=+6310.606358290" Mar 08 21:16:50 crc kubenswrapper[4885]: I0308 21:16:50.210440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6cffb553-3b2f-404c-a7da-d481d4635cfc","Type":"ContainerStarted","Data":"1ce386361920d0c7a6f99b8b942376ded729b83cebc5715fd1be978bf894402d"} Mar 08 21:16:50 crc kubenswrapper[4885]: I0308 21:16:50.211023 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 08 21:16:50 crc kubenswrapper[4885]: I0308 21:16:50.214395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"637ae9d4-1fa5-48e0-87d7-5f6004e0352d","Type":"ContainerStarted","Data":"1793ffed25cc65f9adb660f214971358e5b4ada2129dd8c1de8fbfa2649c3b0e"} Mar 08 21:16:50 crc kubenswrapper[4885]: I0308 21:16:50.240843 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.240823929 
podStartE2EDuration="4.240823929s" podCreationTimestamp="2026-03-08 21:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:16:50.23043002 +0000 UTC m=+6311.626484043" watchObservedRunningTime="2026-03-08 21:16:50.240823929 +0000 UTC m=+6311.636877952" Mar 08 21:16:50 crc kubenswrapper[4885]: I0308 21:16:50.262497 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.609069605 podStartE2EDuration="4.262471389s" podCreationTimestamp="2026-03-08 21:16:46 +0000 UTC" firstStartedPulling="2026-03-08 21:16:47.785839774 +0000 UTC m=+6309.181893807" lastFinishedPulling="2026-03-08 21:16:48.439241578 +0000 UTC m=+6309.835295591" observedRunningTime="2026-03-08 21:16:50.252361368 +0000 UTC m=+6311.648415391" watchObservedRunningTime="2026-03-08 21:16:50.262471389 +0000 UTC m=+6311.658525402" Mar 08 21:16:52 crc kubenswrapper[4885]: I0308 21:16:52.047245 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-cclgv"] Mar 08 21:16:52 crc kubenswrapper[4885]: I0308 21:16:52.068090 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2e95-account-create-update-thqbx"] Mar 08 21:16:52 crc kubenswrapper[4885]: I0308 21:16:52.088076 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2e95-account-create-update-thqbx"] Mar 08 21:16:52 crc kubenswrapper[4885]: I0308 21:16:52.099698 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-cclgv"] Mar 08 21:16:53 crc kubenswrapper[4885]: I0308 21:16:53.373133 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:16:53 crc kubenswrapper[4885]: E0308 21:16:53.374219 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:16:53 crc kubenswrapper[4885]: I0308 21:16:53.384122 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3159e4ac-64da-47ba-9c70-b23214e8b8ad" path="/var/lib/kubelet/pods/3159e4ac-64da-47ba-9c70-b23214e8b8ad/volumes" Mar 08 21:16:53 crc kubenswrapper[4885]: I0308 21:16:53.385787 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e29afc-72d1-4b29-9528-1ed61feed290" path="/var/lib/kubelet/pods/96e29afc-72d1-4b29-9528-1ed61feed290/volumes" Mar 08 21:16:55 crc kubenswrapper[4885]: I0308 21:16:55.277279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c0142342-e857-4238-b442-8e06ceb406e1","Type":"ContainerStarted","Data":"e05a3d48eeef9dbf5cb3e57980aa1eda6f610735ed34f2547835639c0a03c8a7"} Mar 08 21:16:56 crc kubenswrapper[4885]: I0308 21:16:56.295513 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c0142342-e857-4238-b442-8e06ceb406e1","Type":"ContainerStarted","Data":"0ccfb4090278efacc729d490e7c6f0021e94814713d7f2b7387c65e812f838c2"} Mar 08 21:16:56 crc kubenswrapper[4885]: I0308 21:16:56.330747 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.460177233 podStartE2EDuration="10.330725617s" podCreationTimestamp="2026-03-08 21:16:46 +0000 UTC" firstStartedPulling="2026-03-08 21:16:47.63051156 +0000 UTC m=+6309.026565583" lastFinishedPulling="2026-03-08 21:16:54.501059944 +0000 UTC m=+6315.897113967" observedRunningTime="2026-03-08 21:16:56.324655904 +0000 UTC m=+6317.720709927" watchObservedRunningTime="2026-03-08 
21:16:56.330725617 +0000 UTC m=+6317.726779640" Mar 08 21:16:56 crc kubenswrapper[4885]: I0308 21:16:56.835225 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 08 21:16:56 crc kubenswrapper[4885]: I0308 21:16:56.928735 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 08 21:16:56 crc kubenswrapper[4885]: I0308 21:16:56.954760 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.043077 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.043307 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="dnsmasq-dns" containerID="cri-o://196df1785b0b84f41c7ec30c506906a12662281699b2188f8f32887c1ea78555" gracePeriod=10 Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.311562 4885 generic.go:334] "Generic (PLEG): container finished" podID="938eebde-2664-4ae3-8289-e378affb1274" containerID="196df1785b0b84f41c7ec30c506906a12662281699b2188f8f32887c1ea78555" exitCode=0 Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.311624 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" event={"ID":"938eebde-2664-4ae3-8289-e378affb1274","Type":"ContainerDied","Data":"196df1785b0b84f41c7ec30c506906a12662281699b2188f8f32887c1ea78555"} Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.587476 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.659167 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvqpp\" (UniqueName: \"kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp\") pod \"938eebde-2664-4ae3-8289-e378affb1274\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.659487 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc\") pod \"938eebde-2664-4ae3-8289-e378affb1274\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.659581 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb\") pod \"938eebde-2664-4ae3-8289-e378affb1274\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.659709 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb\") pod \"938eebde-2664-4ae3-8289-e378affb1274\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.659819 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config\") pod \"938eebde-2664-4ae3-8289-e378affb1274\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.684740 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp" (OuterVolumeSpecName: "kube-api-access-wvqpp") pod "938eebde-2664-4ae3-8289-e378affb1274" (UID: "938eebde-2664-4ae3-8289-e378affb1274"). InnerVolumeSpecName "kube-api-access-wvqpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.718717 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "938eebde-2664-4ae3-8289-e378affb1274" (UID: "938eebde-2664-4ae3-8289-e378affb1274"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.722663 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "938eebde-2664-4ae3-8289-e378affb1274" (UID: "938eebde-2664-4ae3-8289-e378affb1274"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.759226 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "938eebde-2664-4ae3-8289-e378affb1274" (UID: "938eebde-2664-4ae3-8289-e378affb1274"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.762129 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvqpp\" (UniqueName: \"kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.762160 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.762169 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.762178 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.790617 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config" (OuterVolumeSpecName: "config") pod "938eebde-2664-4ae3-8289-e378affb1274" (UID: "938eebde-2664-4ae3-8289-e378affb1274"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.855691 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"] Mar 08 21:16:57 crc kubenswrapper[4885]: E0308 21:16:57.856637 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="init" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.856659 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="init" Mar 08 21:16:57 crc kubenswrapper[4885]: E0308 21:16:57.856676 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="dnsmasq-dns" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.856684 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="dnsmasq-dns" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.857030 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="dnsmasq-dns" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.858727 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.864967 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.865448 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzkmc\" (UniqueName: \"kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.865754 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.866072 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.870168 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"] Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.967611 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzkmc\" (UniqueName: \"kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc\") 
pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.967870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.968066 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.968622 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.968713 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.982646 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzkmc\" (UniqueName: \"kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " 
pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.184827 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.334996 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" event={"ID":"938eebde-2664-4ae3-8289-e378affb1274","Type":"ContainerDied","Data":"52e08bc4fa9512096f0690af07df832fb34183cd6a6514d48ec0a3244015bcc9"} Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.335047 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.335270 4885 scope.go:117] "RemoveContainer" containerID="196df1785b0b84f41c7ec30c506906a12662281699b2188f8f32887c1ea78555" Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.395294 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.413315 4885 scope.go:117] "RemoveContainer" containerID="12fb871f5a239d7fdcc6ca3f845e422dfc2911258d85c6c9852f5cbe4d01cbdc" Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.418250 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.689652 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"] Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.348814 4885 generic.go:334] "Generic (PLEG): container finished" podID="8c915ae7-cc70-438e-9578-6d8f75368746" containerID="c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3" exitCode=0 Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.348917 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerDied","Data":"c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3"} Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.348974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerStarted","Data":"423561b5afd152a5c60259d35c4878f3807b89bb6c081de0e3c7a84e7abff53a"} Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.393211 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="938eebde-2664-4ae3-8289-e378affb1274" path="/var/lib/kubelet/pods/938eebde-2664-4ae3-8289-e378affb1274/volumes" Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.805987 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.806293 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-central-agent" containerID="cri-o://502661d0a2ae8beac2345b6e58d84b4845d66a96d9e98eb6f82b068ae6c89e19" gracePeriod=30 Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.806733 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="proxy-httpd" containerID="cri-o://da1dd78b37ae44446a818d4ca7a3cae6318a6e093e913892e1824d2141b2c980" gracePeriod=30 Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.806797 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="sg-core" containerID="cri-o://5e52625e338d70abee579a07cf8c7f1e26c7bb826680c63cfe32a93308e3446e" gracePeriod=30 Mar 08 21:16:59 crc kubenswrapper[4885]: 
I0308 21:16:59.806839 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-notification-agent" containerID="cri-o://cded1028147ed518d3cc5fbc5d57f0c592438c500828d06d48dc484f460dbf22" gracePeriod=30 Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.043452 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-4k8w9"] Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.054302 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-4k8w9"] Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.362899 4885 generic.go:334] "Generic (PLEG): container finished" podID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerID="da1dd78b37ae44446a818d4ca7a3cae6318a6e093e913892e1824d2141b2c980" exitCode=0 Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.364290 4885 generic.go:334] "Generic (PLEG): container finished" podID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerID="5e52625e338d70abee579a07cf8c7f1e26c7bb826680c63cfe32a93308e3446e" exitCode=2 Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.364413 4885 generic.go:334] "Generic (PLEG): container finished" podID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerID="cded1028147ed518d3cc5fbc5d57f0c592438c500828d06d48dc484f460dbf22" exitCode=0 Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.364497 4885 generic.go:334] "Generic (PLEG): container finished" podID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerID="502661d0a2ae8beac2345b6e58d84b4845d66a96d9e98eb6f82b068ae6c89e19" exitCode=0 Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.363035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerDied","Data":"da1dd78b37ae44446a818d4ca7a3cae6318a6e093e913892e1824d2141b2c980"} Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 
21:17:00.364671 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerDied","Data":"5e52625e338d70abee579a07cf8c7f1e26c7bb826680c63cfe32a93308e3446e"} Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.364783 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerDied","Data":"cded1028147ed518d3cc5fbc5d57f0c592438c500828d06d48dc484f460dbf22"} Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.364940 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerDied","Data":"502661d0a2ae8beac2345b6e58d84b4845d66a96d9e98eb6f82b068ae6c89e19"} Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.706715 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.731858 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732010 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732118 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data\") pod 
\"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732173 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732201 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84gkp\" (UniqueName: \"kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732273 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732351 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732781 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.733071 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.733077 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.742322 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts" (OuterVolumeSpecName: "scripts") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.768133 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp" (OuterVolumeSpecName: "kube-api-access-84gkp") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "kube-api-access-84gkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.782779 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.821453 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.837328 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84gkp\" (UniqueName: \"kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.837360 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.837370 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.837380 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.837388 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.875240 4885 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data" (OuterVolumeSpecName: "config-data") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.939554 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.380255 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.381427 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6adbeb38-5e1d-43e0-a516-2cc65ad853aa" path="/var/lib/kubelet/pods/6adbeb38-5e1d-43e0-a516-2cc65ad853aa/volumes" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.382288 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerStarted","Data":"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5"} Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.382308 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerDied","Data":"10f08b0d4e88a44af4a08cc4e989bead8ffc2d60df2b2606efa2e0ed7f81eb68"} Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.382328 4885 scope.go:117] "RemoveContainer" containerID="da1dd78b37ae44446a818d4ca7a3cae6318a6e093e913892e1824d2141b2c980" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.407318 4885 scope.go:117] "RemoveContainer" containerID="5e52625e338d70abee579a07cf8c7f1e26c7bb826680c63cfe32a93308e3446e" Mar 08 21:17:01 crc 
kubenswrapper[4885]: I0308 21:17:01.432664 4885 scope.go:117] "RemoveContainer" containerID="cded1028147ed518d3cc5fbc5d57f0c592438c500828d06d48dc484f460dbf22" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.451148 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.467744 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.475816 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:17:01 crc kubenswrapper[4885]: E0308 21:17:01.476380 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="sg-core" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476404 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="sg-core" Mar 08 21:17:01 crc kubenswrapper[4885]: E0308 21:17:01.476424 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="proxy-httpd" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476434 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="proxy-httpd" Mar 08 21:17:01 crc kubenswrapper[4885]: E0308 21:17:01.476454 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-notification-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476462 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-notification-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: E0308 21:17:01.476495 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" 
containerName="ceilometer-central-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476503 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-central-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476780 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-central-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476806 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="proxy-httpd" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476824 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-notification-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476837 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="sg-core" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.479301 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.481487 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.481650 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.483424 4885 scope.go:117] "RemoveContainer" containerID="502661d0a2ae8beac2345b6e58d84b4845d66a96d9e98eb6f82b068ae6c89e19" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.514718 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.552626 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-run-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.552677 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.552858 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.553017 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-config-data\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.553151 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgjvg\" (UniqueName: \"kubernetes.io/projected/661d1124-50bd-4ad4-95a4-ac90994383b3-kube-api-access-kgjvg\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.553399 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-log-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.553546 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-scripts\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.655831 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-run-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656364 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656424 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-config-data\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656494 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgjvg\" (UniqueName: \"kubernetes.io/projected/661d1124-50bd-4ad4-95a4-ac90994383b3-kube-api-access-kgjvg\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656613 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-log-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656708 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-scripts\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656785 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-run-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.657424 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-log-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.661569 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.661728 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.661840 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-config-data\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.663339 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-scripts\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.690457 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgjvg\" (UniqueName: \"kubernetes.io/projected/661d1124-50bd-4ad4-95a4-ac90994383b3-kube-api-access-kgjvg\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0"
Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.804462 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 08 21:17:02 crc kubenswrapper[4885]: I0308 21:17:02.327247 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 08 21:17:02 crc kubenswrapper[4885]: W0308 21:17:02.330273 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod661d1124_50bd_4ad4_95a4_ac90994383b3.slice/crio-032a98a7da7916a9c3259eeaf1325c74a49fdbca0e1e6040e7b4e9b965faef7c WatchSource:0}: Error finding container 032a98a7da7916a9c3259eeaf1325c74a49fdbca0e1e6040e7b4e9b965faef7c: Status 404 returned error can't find the container with id 032a98a7da7916a9c3259eeaf1325c74a49fdbca0e1e6040e7b4e9b965faef7c
Mar 08 21:17:02 crc kubenswrapper[4885]: I0308 21:17:02.394343 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661d1124-50bd-4ad4-95a4-ac90994383b3","Type":"ContainerStarted","Data":"032a98a7da7916a9c3259eeaf1325c74a49fdbca0e1e6040e7b4e9b965faef7c"}
Mar 08 21:17:03 crc kubenswrapper[4885]: I0308 21:17:03.385411 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" path="/var/lib/kubelet/pods/4100eecc-ed79-4489-8a72-ba6a55eec273/volumes"
Mar 08 21:17:03 crc kubenswrapper[4885]: I0308 21:17:03.407790 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661d1124-50bd-4ad4-95a4-ac90994383b3","Type":"ContainerStarted","Data":"6cf77d48c763816b610e164aea0d0c1598cd60036f5751f2d94ebbac3284180e"}
Mar 08 21:17:04 crc kubenswrapper[4885]: I0308 21:17:04.421085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661d1124-50bd-4ad4-95a4-ac90994383b3","Type":"ContainerStarted","Data":"83f40d359a5c5d9deff7036b1c4582694d09cce554a733effa13ee49e530b09c"}
Mar 08 21:17:05 crc kubenswrapper[4885]: I0308 21:17:05.434235 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661d1124-50bd-4ad4-95a4-ac90994383b3","Type":"ContainerStarted","Data":"ea4a6acea05a037e949aedd208b75c95f102be9898efd093fbedbc89544b9c2f"}
Mar 08 21:17:06 crc kubenswrapper[4885]: I0308 21:17:06.368390 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:17:06 crc kubenswrapper[4885]: E0308 21:17:06.369674 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:17:07 crc kubenswrapper[4885]: I0308 21:17:07.463105 4885 generic.go:334] "Generic (PLEG): container finished" podID="8c915ae7-cc70-438e-9578-6d8f75368746" containerID="0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5" exitCode=0
Mar 08 21:17:07 crc kubenswrapper[4885]: I0308 21:17:07.463190 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerDied","Data":"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5"}
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.446436 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.513060 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661d1124-50bd-4ad4-95a4-ac90994383b3","Type":"ContainerStarted","Data":"8a825c377de04827a2193f45be99eefbc3269492ffad4f010adc059deda60865"}
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.513445 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.522239 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerStarted","Data":"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc"}
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.543778 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.275502039 podStartE2EDuration="7.543758583s" podCreationTimestamp="2026-03-08 21:17:01 +0000 UTC" firstStartedPulling="2026-03-08 21:17:02.332567815 +0000 UTC m=+6323.728621838" lastFinishedPulling="2026-03-08 21:17:07.600824359 +0000 UTC m=+6328.996878382" observedRunningTime="2026-03-08 21:17:08.533820817 +0000 UTC m=+6329.929874840" watchObservedRunningTime="2026-03-08 21:17:08.543758583 +0000 UTC m=+6329.939812606"
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.559337 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5qw5v" podStartSLOduration=3.044326338 podStartE2EDuration="11.55932099s" podCreationTimestamp="2026-03-08 21:16:57 +0000 UTC" firstStartedPulling="2026-03-08 21:16:59.352115254 +0000 UTC m=+6320.748169297" lastFinishedPulling="2026-03-08 21:17:07.867109926 +0000 UTC m=+6329.263163949" observedRunningTime="2026-03-08 21:17:08.548354186 +0000 UTC m=+6329.944408209" watchObservedRunningTime="2026-03-08 21:17:08.55932099 +0000 UTC m=+6329.955375013"
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.677442 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.831833 4885 scope.go:117] "RemoveContainer" containerID="81e3094200cf292808dcdd9d841162dea6305875cbf7d44e4dda3138e170a8d5"
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.868741 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.875608 4885 scope.go:117] "RemoveContainer" containerID="d64f534661786890c391e49d5e099a75e38f19a2ce6774e8bd719218475416e7"
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.917206 4885 scope.go:117] "RemoveContainer" containerID="8930c6f62095bcf638d08901a77c2574bea75083141756a22093eb3aac06abfe"
Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.980872 4885 scope.go:117] "RemoveContainer" containerID="b17a0512c9878aeaac8d1e7a329d963d016e56adb51b5c377e47629e2282f0c5"
Mar 08 21:17:09 crc kubenswrapper[4885]: I0308 21:17:09.051330 4885 scope.go:117] "RemoveContainer" containerID="488ce5cd91e5332723ee20f8e8bbaf7d336b87f5ba2cbf84286a0e234f08758e"
Mar 08 21:17:09 crc kubenswrapper[4885]: I0308 21:17:09.099105 4885 scope.go:117] "RemoveContainer" containerID="d5baab592079e81c7a0f9f2d2a048f773b79e232fa80526c94c402b1d3d147c9"
Mar 08 21:17:18 crc kubenswrapper[4885]: I0308 21:17:18.184959 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5qw5v"
Mar 08 21:17:18 crc kubenswrapper[4885]: I0308 21:17:18.185731 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5qw5v"
Mar 08 21:17:18 crc kubenswrapper[4885]: I0308 21:17:18.368651 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:17:18 crc kubenswrapper[4885]: E0308 21:17:18.369504 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:17:19 crc kubenswrapper[4885]: I0308 21:17:19.231806 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5qw5v" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="registry-server" probeResult="failure" output=<
Mar 08 21:17:19 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s
Mar 08 21:17:19 crc kubenswrapper[4885]: >
Mar 08 21:17:28 crc kubenswrapper[4885]: I0308 21:17:28.246374 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5qw5v"
Mar 08 21:17:28 crc kubenswrapper[4885]: I0308 21:17:28.301326 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5qw5v"
Mar 08 21:17:29 crc kubenswrapper[4885]: I0308 21:17:29.053565 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"]
Mar 08 21:17:29 crc kubenswrapper[4885]: I0308 21:17:29.765108 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5qw5v" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="registry-server" containerID="cri-o://b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc" gracePeriod=2
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.369054 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:17:30 crc kubenswrapper[4885]: E0308 21:17:30.370081 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.469266 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qw5v"
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.503466 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzkmc\" (UniqueName: \"kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc\") pod \"8c915ae7-cc70-438e-9578-6d8f75368746\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") "
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.503956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content\") pod \"8c915ae7-cc70-438e-9578-6d8f75368746\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") "
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.504118 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities\") pod \"8c915ae7-cc70-438e-9578-6d8f75368746\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") "
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.507399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities" (OuterVolumeSpecName: "utilities") pod "8c915ae7-cc70-438e-9578-6d8f75368746" (UID: "8c915ae7-cc70-438e-9578-6d8f75368746"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.509227 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc" (OuterVolumeSpecName: "kube-api-access-vzkmc") pod "8c915ae7-cc70-438e-9578-6d8f75368746" (UID: "8c915ae7-cc70-438e-9578-6d8f75368746"). InnerVolumeSpecName "kube-api-access-vzkmc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.607439 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.607511 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzkmc\" (UniqueName: \"kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc\") on node \"crc\" DevicePath \"\""
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.652689 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c915ae7-cc70-438e-9578-6d8f75368746" (UID: "8c915ae7-cc70-438e-9578-6d8f75368746"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.709376 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.777877 4885 generic.go:334] "Generic (PLEG): container finished" podID="8c915ae7-cc70-438e-9578-6d8f75368746" containerID="b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc" exitCode=0
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.778031 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qw5v"
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.778045 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerDied","Data":"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc"}
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.778198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerDied","Data":"423561b5afd152a5c60259d35c4878f3807b89bb6c081de0e3c7a84e7abff53a"}
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.778254 4885 scope.go:117] "RemoveContainer" containerID="b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc"
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.814476 4885 scope.go:117] "RemoveContainer" containerID="0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5"
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.840401 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"]
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.856284 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"]
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.867521 4885 scope.go:117] "RemoveContainer" containerID="c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3"
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.933253 4885 scope.go:117] "RemoveContainer" containerID="b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc"
Mar 08 21:17:30 crc kubenswrapper[4885]: E0308 21:17:30.933788 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc\": container with ID starting with b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc not found: ID does not exist" containerID="b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc"
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.933839 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc"} err="failed to get container status \"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc\": rpc error: code = NotFound desc = could not find container \"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc\": container with ID starting with b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc not found: ID does not exist"
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.933874 4885 scope.go:117] "RemoveContainer" containerID="0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5"
Mar 08 21:17:30 crc kubenswrapper[4885]: E0308 21:17:30.934405 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5\": container with ID starting with 0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5 not found: ID does not exist" containerID="0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5"
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.934443 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5"} err="failed to get container status \"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5\": rpc error: code = NotFound desc = could not find container \"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5\": container with ID starting with 0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5 not found: ID does not exist"
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.934471 4885 scope.go:117] "RemoveContainer" containerID="c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3"
Mar 08 21:17:30 crc kubenswrapper[4885]: E0308 21:17:30.934770 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3\": container with ID starting with c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3 not found: ID does not exist" containerID="c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3"
Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.934808 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3"} err="failed to get container status \"c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3\": rpc error: code = NotFound desc = could not find container \"c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3\": container with ID starting with c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3 not found: ID does not exist"
Mar 08 21:17:31 crc kubenswrapper[4885]: I0308 21:17:31.384892 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" path="/var/lib/kubelet/pods/8c915ae7-cc70-438e-9578-6d8f75368746/volumes"
Mar 08 21:17:31 crc kubenswrapper[4885]: I0308 21:17:31.821326 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 08 21:17:43 crc kubenswrapper[4885]: I0308 21:17:43.369427 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:17:43 crc kubenswrapper[4885]: E0308 21:17:43.370530 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.288063 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"]
Mar 08 21:17:52 crc kubenswrapper[4885]: E0308 21:17:52.289091 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="extract-utilities"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.289108 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="extract-utilities"
Mar 08 21:17:52 crc kubenswrapper[4885]: E0308 21:17:52.289137 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="registry-server"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.289145 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="registry-server"
Mar 08 21:17:52 crc kubenswrapper[4885]: E0308 21:17:52.289182 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="extract-content"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.289191 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="extract-content"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.289455 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="registry-server"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.290851 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.294364 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.329267 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.329326 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.329401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.329422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd22h\" (UniqueName: \"kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.329506 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.329544 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.352214 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"]
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432418 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432471 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd22h\" (UniqueName: \"kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432599 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432654 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432687 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432722 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.433795 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.433832 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.434489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.434696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.434742 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.452690 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd22h\" (UniqueName: \"kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.657189 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:53 crc kubenswrapper[4885]: I0308 21:17:53.185475 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"]
Mar 08 21:17:54 crc kubenswrapper[4885]: I0308 21:17:54.050878 4885 generic.go:334] "Generic (PLEG): container finished" podID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerID="cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516" exitCode=0
Mar 08 21:17:54 crc kubenswrapper[4885]: I0308 21:17:54.051052 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" event={"ID":"784dc7bd-2dba-4638-881d-7bb4b97fa26f","Type":"ContainerDied","Data":"cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516"}
Mar 08 21:17:54 crc kubenswrapper[4885]: I0308 21:17:54.051279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" event={"ID":"784dc7bd-2dba-4638-881d-7bb4b97fa26f","Type":"ContainerStarted","Data":"f44206b8ebbf0693d2d4ad48b7c945bd77de0d1345fd96143b142bf69f386619"}
Mar 08 21:17:55 crc kubenswrapper[4885]: I0308 21:17:55.067124 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" event={"ID":"784dc7bd-2dba-4638-881d-7bb4b97fa26f","Type":"ContainerStarted","Data":"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697"}
Mar 08 21:17:55 crc kubenswrapper[4885]: I0308 21:17:55.067403 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:17:55 crc kubenswrapper[4885]: I0308 21:17:55.096831 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" podStartSLOduration=3.096809621 podStartE2EDuration="3.096809621s" podCreationTimestamp="2026-03-08 21:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:17:55.090452141 +0000 UTC m=+6376.486506174" watchObservedRunningTime="2026-03-08 21:17:55.096809621 +0000 UTC m=+6376.492863654"
Mar 08 21:17:56 crc kubenswrapper[4885]: I0308 21:17:56.369113 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:17:56 crc kubenswrapper[4885]: E0308 21:17:56.369791 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.145251 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550078-gspxv"]
Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.147553 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.149631 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.149649 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.150425 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.157733 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550078-gspxv"] Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.228779 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qlcw\" (UniqueName: \"kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw\") pod \"auto-csr-approver-29550078-gspxv\" (UID: \"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee\") " pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.332977 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlcw\" (UniqueName: \"kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw\") pod \"auto-csr-approver-29550078-gspxv\" (UID: \"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee\") " pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.354268 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlcw\" (UniqueName: \"kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw\") pod \"auto-csr-approver-29550078-gspxv\" (UID: \"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee\") " 
pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.469146 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:01 crc kubenswrapper[4885]: I0308 21:18:01.032052 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550078-gspxv"] Mar 08 21:18:01 crc kubenswrapper[4885]: I0308 21:18:01.144379 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550078-gspxv" event={"ID":"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee","Type":"ContainerStarted","Data":"f23b90c06bbc8a98015e798c494ff9c9bd900ab4e2ab5f9f208b0410de047ce4"} Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.658073 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.795871 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.796402 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c5b968869-pr98k" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="dnsmasq-dns" containerID="cri-o://4cb0262f2854e6a7701317c1b59e47291390cddca3f70f2c87b579f72182540b" gracePeriod=10 Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.885013 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-579b4494b9-nwf4n"] Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.887512 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.896365 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-579b4494b9-nwf4n"] Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-sb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935756 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-dns-svc\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935809 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-config\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935845 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-openstack-cell1\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935889 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-nb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935973 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd5rw\" (UniqueName: \"kubernetes.io/projected/cb658095-55a6-4c1a-a84b-23ad21d14212-kube-api-access-zd5rw\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.037956 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-nb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038060 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd5rw\" (UniqueName: \"kubernetes.io/projected/cb658095-55a6-4c1a-a84b-23ad21d14212-kube-api-access-zd5rw\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-sb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038216 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-dns-svc\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-config\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038333 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-openstack-cell1\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038851 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-nb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.039308 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-openstack-cell1\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.039444 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-config\") pod 
\"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.039884 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-sb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.040758 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-dns-svc\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.057784 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd5rw\" (UniqueName: \"kubernetes.io/projected/cb658095-55a6-4c1a-a84b-23ad21d14212-kube-api-access-zd5rw\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.179703 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550078-gspxv" event={"ID":"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee","Type":"ContainerStarted","Data":"cdd1e2f0b8ab5f05aa992613d4a0e2df88f958ed88a6e2f7da9cadebeb33bfdc"} Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.187544 4885 generic.go:334] "Generic (PLEG): container finished" podID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerID="4cb0262f2854e6a7701317c1b59e47291390cddca3f70f2c87b579f72182540b" exitCode=0 Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.187588 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-c5b968869-pr98k" event={"ID":"f9f2857a-2ac8-4c56-8fa0-b153c52309f3","Type":"ContainerDied","Data":"4cb0262f2854e6a7701317c1b59e47291390cddca3f70f2c87b579f72182540b"} Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.209602 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550078-gspxv" podStartSLOduration=2.093888466 podStartE2EDuration="3.209583402s" podCreationTimestamp="2026-03-08 21:18:00 +0000 UTC" firstStartedPulling="2026-03-08 21:18:01.046659915 +0000 UTC m=+6382.442713948" lastFinishedPulling="2026-03-08 21:18:02.162354851 +0000 UTC m=+6383.558408884" observedRunningTime="2026-03-08 21:18:03.194979091 +0000 UTC m=+6384.591033114" watchObservedRunningTime="2026-03-08 21:18:03.209583402 +0000 UTC m=+6384.605637415" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.210651 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.465661 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.657014 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc\") pod \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.657389 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb\") pod \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.657472 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb\") pod \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.657490 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zkhb\" (UniqueName: \"kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb\") pod \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.657574 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config\") pod \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.714274 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb" (OuterVolumeSpecName: "kube-api-access-6zkhb") pod "f9f2857a-2ac8-4c56-8fa0-b153c52309f3" (UID: "f9f2857a-2ac8-4c56-8fa0-b153c52309f3"). InnerVolumeSpecName "kube-api-access-6zkhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.726934 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config" (OuterVolumeSpecName: "config") pod "f9f2857a-2ac8-4c56-8fa0-b153c52309f3" (UID: "f9f2857a-2ac8-4c56-8fa0-b153c52309f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.736241 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9f2857a-2ac8-4c56-8fa0-b153c52309f3" (UID: "f9f2857a-2ac8-4c56-8fa0-b153c52309f3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.742968 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9f2857a-2ac8-4c56-8fa0-b153c52309f3" (UID: "f9f2857a-2ac8-4c56-8fa0-b153c52309f3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.748447 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9f2857a-2ac8-4c56-8fa0-b153c52309f3" (UID: "f9f2857a-2ac8-4c56-8fa0-b153c52309f3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.764755 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.764803 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.764816 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.764828 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zkhb\" (UniqueName: \"kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.764837 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:04 crc kubenswrapper[4885]: W0308 21:18:04.065043 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb658095_55a6_4c1a_a84b_23ad21d14212.slice/crio-60269b9e0eee1d01c821b8fa3a0bad871dc6d7b6b76bf7ea46d8310ccf88e2ae WatchSource:0}: Error finding container 60269b9e0eee1d01c821b8fa3a0bad871dc6d7b6b76bf7ea46d8310ccf88e2ae: Status 404 returned error can't find the container with id 60269b9e0eee1d01c821b8fa3a0bad871dc6d7b6b76bf7ea46d8310ccf88e2ae Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.067896 4885 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-579b4494b9-nwf4n"] Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.199667 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.199712 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c5b968869-pr98k" event={"ID":"f9f2857a-2ac8-4c56-8fa0-b153c52309f3","Type":"ContainerDied","Data":"4bf88b828aac67327d42c79896ad6c9847535e8fc25a30b67b13b6ef37494170"} Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.200092 4885 scope.go:117] "RemoveContainer" containerID="4cb0262f2854e6a7701317c1b59e47291390cddca3f70f2c87b579f72182540b" Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.202186 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" event={"ID":"cb658095-55a6-4c1a-a84b-23ad21d14212","Type":"ContainerStarted","Data":"60269b9e0eee1d01c821b8fa3a0bad871dc6d7b6b76bf7ea46d8310ccf88e2ae"} Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.238103 4885 scope.go:117] "RemoveContainer" containerID="b94ccd4d45aaad15e9b13ae53af2356f201fb751b721183d822d6ea73417bf92" Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.238560 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.246981 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:18:05 crc kubenswrapper[4885]: I0308 21:18:05.215572 4885 generic.go:334] "Generic (PLEG): container finished" podID="cb658095-55a6-4c1a-a84b-23ad21d14212" containerID="d901c636d39623e233e3469473700802dfec14237f6497e2b4cab381597ec10f" exitCode=0 Mar 08 21:18:05 crc kubenswrapper[4885]: I0308 21:18:05.215635 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" event={"ID":"cb658095-55a6-4c1a-a84b-23ad21d14212","Type":"ContainerDied","Data":"d901c636d39623e233e3469473700802dfec14237f6497e2b4cab381597ec10f"} Mar 08 21:18:05 crc kubenswrapper[4885]: I0308 21:18:05.220531 4885 generic.go:334] "Generic (PLEG): container finished" podID="1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" containerID="cdd1e2f0b8ab5f05aa992613d4a0e2df88f958ed88a6e2f7da9cadebeb33bfdc" exitCode=0 Mar 08 21:18:05 crc kubenswrapper[4885]: I0308 21:18:05.220617 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550078-gspxv" event={"ID":"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee","Type":"ContainerDied","Data":"cdd1e2f0b8ab5f05aa992613d4a0e2df88f958ed88a6e2f7da9cadebeb33bfdc"} Mar 08 21:18:05 crc kubenswrapper[4885]: I0308 21:18:05.390228 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" path="/var/lib/kubelet/pods/f9f2857a-2ac8-4c56-8fa0-b153c52309f3/volumes" Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.237961 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" event={"ID":"cb658095-55a6-4c1a-a84b-23ad21d14212","Type":"ContainerStarted","Data":"04b0c5b1dd7e2164d5bcbebfe6ef55f4a69fc1d33f227cd891f00ba11679595d"} Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.238272 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.275868 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" podStartSLOduration=4.275841652 podStartE2EDuration="4.275841652s" podCreationTimestamp="2026-03-08 21:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:18:06.262156735 +0000 UTC m=+6387.658210798" 
watchObservedRunningTime="2026-03-08 21:18:06.275841652 +0000 UTC m=+6387.671895685" Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.691169 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.730842 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qlcw\" (UniqueName: \"kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw\") pod \"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee\" (UID: \"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee\") " Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.737615 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw" (OuterVolumeSpecName: "kube-api-access-6qlcw") pod "1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" (UID: "1f0d7d35-3a1c-4505-8fa3-190d8ec038ee"). InnerVolumeSpecName "kube-api-access-6qlcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.834040 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qlcw\" (UniqueName: \"kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:07 crc kubenswrapper[4885]: I0308 21:18:07.253393 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:07 crc kubenswrapper[4885]: I0308 21:18:07.254945 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550078-gspxv" event={"ID":"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee","Type":"ContainerDied","Data":"f23b90c06bbc8a98015e798c494ff9c9bd900ab4e2ab5f9f208b0410de047ce4"} Mar 08 21:18:07 crc kubenswrapper[4885]: I0308 21:18:07.255112 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f23b90c06bbc8a98015e798c494ff9c9bd900ab4e2ab5f9f208b0410de047ce4" Mar 08 21:18:07 crc kubenswrapper[4885]: I0308 21:18:07.351941 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550072-7xh8c"] Mar 08 21:18:07 crc kubenswrapper[4885]: I0308 21:18:07.386320 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550072-7xh8c"] Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.663942 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"] Mar 08 21:18:08 crc kubenswrapper[4885]: E0308 21:18:08.664332 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="dnsmasq-dns" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.664345 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="dnsmasq-dns" Mar 08 21:18:08 crc kubenswrapper[4885]: E0308 21:18:08.664354 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" containerName="oc" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.664360 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" containerName="oc" Mar 08 21:18:08 crc kubenswrapper[4885]: E0308 21:18:08.664401 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="init" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.664409 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="init" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.664591 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" containerName="oc" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.664608 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="dnsmasq-dns" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.665272 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.667846 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.667971 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.667887 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.668269 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.673328 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.673420 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.673456 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.673491 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vzp2\" (UniqueName: \"kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.673533 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.679101 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"]
Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.775392 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.775454 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.775503 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vzp2\" (UniqueName: \"kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.775550 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.775690 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.793260 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.823376 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.829529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") "
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.836339 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.837719 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vzp2\" (UniqueName: \"kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.007283 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.362677 4885 scope.go:117] "RemoveContainer" containerID="b0909c7e110caec9a5e1944f3751a491442563718d6ad89214ea7bdc75d423aa"
Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.381635 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4225aab0-53fa-4aa3-ac19-6827ea262916" path="/var/lib/kubelet/pods/4225aab0-53fa-4aa3-ac19-6827ea262916/volumes"
Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.544488 4885 scope.go:117] "RemoveContainer" containerID="3931ee1c99d16a879ccfc12229692a2bae1b407bce8d0e34d3d46a2ffcee39dc"
Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.549689 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"]
Mar 08 21:18:09 crc kubenswrapper[4885]: W0308 21:18:09.551563 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87894214_b974_4fc7_b23d_d739fde2466f.slice/crio-e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a WatchSource:0}: Error finding container e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a: Status 404 returned error can't find the container with id e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a
Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.614080 4885 scope.go:117] "RemoveContainer" containerID="37c658fc25b8a42ab3b33c1713dd08f3921d30fceba25de9d5cd0b6ec8c45fc8"
Mar 08 21:18:10 crc kubenswrapper[4885]: I0308 21:18:10.281456 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" event={"ID":"87894214-b974-4fc7-b23d-d739fde2466f","Type":"ContainerStarted","Data":"e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a"}
Mar 08 21:18:11
crc kubenswrapper[4885]: I0308 21:18:11.369506 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:18:12 crc kubenswrapper[4885]: I0308 21:18:12.306637 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3"}
Mar 08 21:18:13 crc kubenswrapper[4885]: I0308 21:18:13.212227 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n"
Mar 08 21:18:13 crc kubenswrapper[4885]: I0308 21:18:13.307251 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"]
Mar 08 21:18:13 crc kubenswrapper[4885]: I0308 21:18:13.307805 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="dnsmasq-dns" containerID="cri-o://8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697" gracePeriod=10
Mar 08 21:18:13 crc kubenswrapper[4885]: I0308 21:18:13.871419 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007176 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") "
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007221 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") "
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007268 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") "
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007308 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd22h\" (UniqueName: \"kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") "
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007469 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") "
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007520 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\"
(UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") "
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.016049 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h" (OuterVolumeSpecName: "kube-api-access-jd22h") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "kube-api-access-jd22h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.061181 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.069436 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.072755 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config" (OuterVolumeSpecName: "config") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.080028 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.083675 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.110569 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config\") on node \"crc\" DevicePath \"\""
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.110599 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.110611 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.110622 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 08 21:18:14 crc
kubenswrapper[4885]: I0308 21:18:14.110634 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.110645 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd22h\" (UniqueName: \"kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h\") on node \"crc\" DevicePath \"\""
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.353623 4885 generic.go:334] "Generic (PLEG): container finished" podID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerID="8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697" exitCode=0
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.353676 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" event={"ID":"784dc7bd-2dba-4638-881d-7bb4b97fa26f","Type":"ContainerDied","Data":"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697"}
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.353708 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" event={"ID":"784dc7bd-2dba-4638-881d-7bb4b97fa26f","Type":"ContainerDied","Data":"f44206b8ebbf0693d2d4ad48b7c945bd77de0d1345fd96143b142bf69f386619"}
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.353727 4885 scope.go:117] "RemoveContainer" containerID="8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697"
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.353898 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9"
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.410786 4885 scope.go:117] "RemoveContainer" containerID="cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516"
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.415698 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"]
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.441045 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"]
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.510949 4885 scope.go:117] "RemoveContainer" containerID="8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697"
Mar 08 21:18:14 crc kubenswrapper[4885]: E0308 21:18:14.516603 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697\": container with ID starting with 8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697 not found: ID does not exist" containerID="8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697"
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.516638 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697"} err="failed to get container status \"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697\": rpc error: code = NotFound desc = could not find container \"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697\": container with ID starting with 8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697 not found: ID does not exist"
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.516662 4885 scope.go:117] "RemoveContainer" containerID="cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516"
Mar 08
21:18:14 crc kubenswrapper[4885]: E0308 21:18:14.517099 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516\": container with ID starting with cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516 not found: ID does not exist" containerID="cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516"
Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.517157 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516"} err="failed to get container status \"cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516\": rpc error: code = NotFound desc = could not find container \"cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516\": container with ID starting with cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516 not found: ID does not exist"
Mar 08 21:18:15 crc kubenswrapper[4885]: I0308 21:18:15.384423 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" path="/var/lib/kubelet/pods/784dc7bd-2dba-4638-881d-7bb4b97fa26f/volumes"
Mar 08 21:18:22 crc kubenswrapper[4885]: I0308 21:18:22.460903 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" event={"ID":"87894214-b974-4fc7-b23d-d739fde2466f","Type":"ContainerStarted","Data":"dc207e750510567356d9b178140ae440b16726ea2563a8ea1f73515bbd6991e8"}
Mar 08 21:18:22 crc kubenswrapper[4885]: I0308 21:18:22.482679 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" podStartSLOduration=2.555009774 podStartE2EDuration="14.48265927s" podCreationTimestamp="2026-03-08 21:18:08 +0000 UTC" firstStartedPulling="2026-03-08 21:18:09.554524216 +0000 UTC m=+6390.950578249" lastFinishedPulling="2026-03-08 21:18:21.482173682 +0000 UTC m=+6402.878227745" observedRunningTime="2026-03-08 21:18:22.480318237 +0000 UTC m=+6403.876372260" watchObservedRunningTime="2026-03-08 21:18:22.48265927 +0000 UTC m=+6403.878713293"
Mar 08 21:18:35 crc kubenswrapper[4885]: I0308 21:18:35.655677 4885 generic.go:334] "Generic (PLEG): container finished" podID="87894214-b974-4fc7-b23d-d739fde2466f" containerID="dc207e750510567356d9b178140ae440b16726ea2563a8ea1f73515bbd6991e8" exitCode=0
Mar 08 21:18:35 crc kubenswrapper[4885]: I0308 21:18:35.655753 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" event={"ID":"87894214-b974-4fc7-b23d-d739fde2466f","Type":"ContainerDied","Data":"dc207e750510567356d9b178140ae440b16726ea2563a8ea1f73515bbd6991e8"}
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.305383 4885 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.420708 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle\") pod \"87894214-b974-4fc7-b23d-d739fde2466f\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") "
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.420787 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph\") pod \"87894214-b974-4fc7-b23d-d739fde2466f\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") "
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.420824 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vzp2\" (UniqueName: \"kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2\") pod \"87894214-b974-4fc7-b23d-d739fde2466f\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") "
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.420912 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory\") pod \"87894214-b974-4fc7-b23d-d739fde2466f\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") "
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.421278 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1\") pod \"87894214-b974-4fc7-b23d-d739fde2466f\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") "
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.430561 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2" (OuterVolumeSpecName: "kube-api-access-8vzp2") pod "87894214-b974-4fc7-b23d-d739fde2466f" (UID: "87894214-b974-4fc7-b23d-d739fde2466f"). InnerVolumeSpecName "kube-api-access-8vzp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.432594 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "87894214-b974-4fc7-b23d-d739fde2466f" (UID: "87894214-b974-4fc7-b23d-d739fde2466f"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.432662 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph" (OuterVolumeSpecName: "ceph") pod "87894214-b974-4fc7-b23d-d739fde2466f" (UID: "87894214-b974-4fc7-b23d-d739fde2466f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.462004 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "87894214-b974-4fc7-b23d-d739fde2466f" (UID: "87894214-b974-4fc7-b23d-d739fde2466f"). InnerVolumeSpecName "ssh-key-openstack-cell1".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.465255 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory" (OuterVolumeSpecName: "inventory") pod "87894214-b974-4fc7-b23d-d739fde2466f" (UID: "87894214-b974-4fc7-b23d-d739fde2466f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.526623 4885 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.526665 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph\") on node \"crc\" DevicePath \"\""
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.526683 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vzp2\" (UniqueName: \"kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2\") on node \"crc\" DevicePath \"\""
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.526736 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory\") on node \"crc\" DevicePath \"\""
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.526746 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.686725 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" event={"ID":"87894214-b974-4fc7-b23d-d739fde2466f","Type":"ContainerDied","Data":"e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a"}
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.687385 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a"
Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.686799 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.522969 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"]
Mar 08 21:18:46 crc kubenswrapper[4885]: E0308 21:18:46.524912 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87894214-b974-4fc7-b23d-d739fde2466f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.524957 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="87894214-b974-4fc7-b23d-d739fde2466f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1"
Mar 08 21:18:46 crc kubenswrapper[4885]: E0308 21:18:46.524983 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="dnsmasq-dns"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.524992 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="dnsmasq-dns"
Mar 08 21:18:46 crc kubenswrapper[4885]: E0308 21:18:46.525014 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="init"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.525022 4885
state_mem.go:107] "Deleted CPUSet assignment" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="init"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.525931 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="dnsmasq-dns"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.526179 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="87894214-b974-4fc7-b23d-d739fde2466f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.528293 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.533684 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.534891 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.535104 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.535338 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.540264 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"]
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.640406 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.640730 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.640841 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc9lt\" (UniqueName: \"kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.641115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"
Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.641264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") "
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.744609 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.744736 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.744788 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc9lt\" (UniqueName: \"kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.744973 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.745037 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.752726 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.752725 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.752984 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.758704 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.775573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc9lt\" (UniqueName: \"kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.854808 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:47 crc kubenswrapper[4885]: I0308 21:18:47.494045 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"] Mar 08 21:18:47 crc kubenswrapper[4885]: I0308 21:18:47.816076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" event={"ID":"8be575f8-a741-4b5a-b7fa-c43e5dd65598","Type":"ContainerStarted","Data":"fcaf57e77e3aeefacc328413ba9ef9243f80683c97c18bb5d80337ab11bd45a0"} Mar 08 21:18:48 crc kubenswrapper[4885]: I0308 21:18:48.833310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" event={"ID":"8be575f8-a741-4b5a-b7fa-c43e5dd65598","Type":"ContainerStarted","Data":"9a54c0c84a047f8db9c0abfba2cd8a399a25f81595ecb87af623107a04129487"} Mar 08 21:18:48 crc kubenswrapper[4885]: I0308 21:18:48.863436 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" podStartSLOduration=2.42773402 podStartE2EDuration="2.863414468s" podCreationTimestamp="2026-03-08 21:18:46 +0000 UTC" firstStartedPulling="2026-03-08 21:18:47.497364612 +0000 UTC m=+6428.893418645" 
lastFinishedPulling="2026-03-08 21:18:47.93304503 +0000 UTC m=+6429.329099093" observedRunningTime="2026-03-08 21:18:48.859801531 +0000 UTC m=+6430.255855594" watchObservedRunningTime="2026-03-08 21:18:48.863414468 +0000 UTC m=+6430.259468511" Mar 08 21:19:46 crc kubenswrapper[4885]: I0308 21:19:46.071852 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-l5ssn"] Mar 08 21:19:46 crc kubenswrapper[4885]: I0308 21:19:46.093616 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-l5ssn"] Mar 08 21:19:47 crc kubenswrapper[4885]: I0308 21:19:47.385370 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93292c62-a3f4-439e-98fd-85ff17958f38" path="/var/lib/kubelet/pods/93292c62-a3f4-439e-98fd-85ff17958f38/volumes" Mar 08 21:19:48 crc kubenswrapper[4885]: I0308 21:19:48.044690 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-208d-account-create-update-22tjb"] Mar 08 21:19:48 crc kubenswrapper[4885]: I0308 21:19:48.056780 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-208d-account-create-update-22tjb"] Mar 08 21:19:49 crc kubenswrapper[4885]: I0308 21:19:49.389163 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef43c9be-bb15-4927-b520-fe1b5ea3cabb" path="/var/lib/kubelet/pods/ef43c9be-bb15-4927-b520-fe1b5ea3cabb/volumes" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.380822 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.383818 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.394513 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.553890 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkrlh\" (UniqueName: \"kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.553989 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.554076 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.656061 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkrlh\" (UniqueName: \"kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.656153 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.656216 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.656652 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.656742 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.679484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkrlh\" (UniqueName: \"kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.715856 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:51 crc kubenswrapper[4885]: I0308 21:19:51.182280 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:19:51 crc kubenswrapper[4885]: I0308 21:19:51.670588 4885 generic.go:334] "Generic (PLEG): container finished" podID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerID="a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164" exitCode=0 Mar 08 21:19:51 crc kubenswrapper[4885]: I0308 21:19:51.670645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerDied","Data":"a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164"} Mar 08 21:19:51 crc kubenswrapper[4885]: I0308 21:19:51.670668 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerStarted","Data":"02a8a96a2f1f9be2101d93146b1848a38e54aef39da2adc96249ad0085caa00c"} Mar 08 21:19:51 crc kubenswrapper[4885]: I0308 21:19:51.672607 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:19:52 crc kubenswrapper[4885]: I0308 21:19:52.692296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerStarted","Data":"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e"} Mar 08 21:19:53 crc kubenswrapper[4885]: I0308 21:19:53.038975 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-kg2st"] Mar 08 21:19:53 crc kubenswrapper[4885]: I0308 21:19:53.048935 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-kg2st"] Mar 08 21:19:53 crc 
kubenswrapper[4885]: I0308 21:19:53.384802 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d6415b-d535-426e-a500-cd8e25255bde" path="/var/lib/kubelet/pods/c4d6415b-d535-426e-a500-cd8e25255bde/volumes" Mar 08 21:19:53 crc kubenswrapper[4885]: I0308 21:19:53.706996 4885 generic.go:334] "Generic (PLEG): container finished" podID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerID="2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e" exitCode=0 Mar 08 21:19:53 crc kubenswrapper[4885]: I0308 21:19:53.707086 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerDied","Data":"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e"} Mar 08 21:19:54 crc kubenswrapper[4885]: I0308 21:19:54.075811 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-70a3-account-create-update-kvgcv"] Mar 08 21:19:54 crc kubenswrapper[4885]: I0308 21:19:54.085705 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-70a3-account-create-update-kvgcv"] Mar 08 21:19:54 crc kubenswrapper[4885]: I0308 21:19:54.718908 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerStarted","Data":"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61"} Mar 08 21:19:54 crc kubenswrapper[4885]: I0308 21:19:54.752423 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qslr6" podStartSLOduration=2.332051724 podStartE2EDuration="4.752398891s" podCreationTimestamp="2026-03-08 21:19:50 +0000 UTC" firstStartedPulling="2026-03-08 21:19:51.672400442 +0000 UTC m=+6493.068454465" lastFinishedPulling="2026-03-08 21:19:54.092747569 +0000 UTC m=+6495.488801632" observedRunningTime="2026-03-08 
21:19:54.737734088 +0000 UTC m=+6496.133788101" watchObservedRunningTime="2026-03-08 21:19:54.752398891 +0000 UTC m=+6496.148452944" Mar 08 21:19:55 crc kubenswrapper[4885]: I0308 21:19:55.379678 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" path="/var/lib/kubelet/pods/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f/volumes" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.160179 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550080-q8z87"] Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.163116 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.166561 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.166867 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.167069 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.172083 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550080-q8z87"] Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.290426 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b62p8\" (UniqueName: \"kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8\") pod \"auto-csr-approver-29550080-q8z87\" (UID: \"099da518-0e8c-4661-86bf-efcce5fd4f59\") " pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.393033 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b62p8\" (UniqueName: \"kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8\") pod \"auto-csr-approver-29550080-q8z87\" (UID: \"099da518-0e8c-4661-86bf-efcce5fd4f59\") " pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.428667 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b62p8\" (UniqueName: \"kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8\") pod \"auto-csr-approver-29550080-q8z87\" (UID: \"099da518-0e8c-4661-86bf-efcce5fd4f59\") " pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.492513 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.717221 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.717370 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.788561 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.847436 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:01 crc kubenswrapper[4885]: I0308 21:20:01.034234 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:20:01 crc kubenswrapper[4885]: I0308 21:20:01.054739 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29550080-q8z87"] Mar 08 21:20:01 crc kubenswrapper[4885]: I0308 21:20:01.808362 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550080-q8z87" event={"ID":"099da518-0e8c-4661-86bf-efcce5fd4f59","Type":"ContainerStarted","Data":"bd5550eb8bb8b9907536a5623038cd9d8f560a22bc77082293d5ea76e71eee9e"} Mar 08 21:20:02 crc kubenswrapper[4885]: I0308 21:20:02.817303 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qslr6" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="registry-server" containerID="cri-o://da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61" gracePeriod=2 Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.423164 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.579583 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities\") pod \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.579746 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkrlh\" (UniqueName: \"kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh\") pod \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.579945 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content\") pod \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\" (UID: 
\"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.580941 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities" (OuterVolumeSpecName: "utilities") pod "55c2a47a-66e1-4e37-a252-04e2b98eb7bf" (UID: "55c2a47a-66e1-4e37-a252-04e2b98eb7bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.590122 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh" (OuterVolumeSpecName: "kube-api-access-dkrlh") pod "55c2a47a-66e1-4e37-a252-04e2b98eb7bf" (UID: "55c2a47a-66e1-4e37-a252-04e2b98eb7bf"). InnerVolumeSpecName "kube-api-access-dkrlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.606178 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55c2a47a-66e1-4e37-a252-04e2b98eb7bf" (UID: "55c2a47a-66e1-4e37-a252-04e2b98eb7bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.683470 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.683706 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.683718 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkrlh\" (UniqueName: \"kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh\") on node \"crc\" DevicePath \"\"" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.827246 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550080-q8z87" event={"ID":"099da518-0e8c-4661-86bf-efcce5fd4f59","Type":"ContainerStarted","Data":"34b50f5f966e37811bb8a32ad3d6e1abb40b701290a57e1661e950c1bc924933"} Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.830858 4885 generic.go:334] "Generic (PLEG): container finished" podID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerID="da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61" exitCode=0 Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.830891 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerDied","Data":"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61"} Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.830947 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" 
event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerDied","Data":"02a8a96a2f1f9be2101d93146b1848a38e54aef39da2adc96249ad0085caa00c"} Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.830966 4885 scope.go:117] "RemoveContainer" containerID="da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.830986 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.852848 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550080-q8z87" podStartSLOduration=2.494814973 podStartE2EDuration="3.852818784s" podCreationTimestamp="2026-03-08 21:20:00 +0000 UTC" firstStartedPulling="2026-03-08 21:20:01.058462292 +0000 UTC m=+6502.454516325" lastFinishedPulling="2026-03-08 21:20:02.416466073 +0000 UTC m=+6503.812520136" observedRunningTime="2026-03-08 21:20:03.849575897 +0000 UTC m=+6505.245629930" watchObservedRunningTime="2026-03-08 21:20:03.852818784 +0000 UTC m=+6505.248872807" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.908609 4885 scope.go:117] "RemoveContainer" containerID="2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.912823 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.922999 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.934751 4885 scope.go:117] "RemoveContainer" containerID="a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164" Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.001657 4885 scope.go:117] "RemoveContainer" 
containerID="da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61" Mar 08 21:20:04 crc kubenswrapper[4885]: E0308 21:20:04.002220 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61\": container with ID starting with da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61 not found: ID does not exist" containerID="da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61" Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.002303 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61"} err="failed to get container status \"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61\": rpc error: code = NotFound desc = could not find container \"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61\": container with ID starting with da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61 not found: ID does not exist" Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.002365 4885 scope.go:117] "RemoveContainer" containerID="2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e" Mar 08 21:20:04 crc kubenswrapper[4885]: E0308 21:20:04.002876 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e\": container with ID starting with 2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e not found: ID does not exist" containerID="2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e" Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.002903 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e"} err="failed to get container status \"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e\": rpc error: code = NotFound desc = could not find container \"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e\": container with ID starting with 2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e not found: ID does not exist"
Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.002984 4885 scope.go:117] "RemoveContainer" containerID="a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164"
Mar 08 21:20:04 crc kubenswrapper[4885]: E0308 21:20:04.003324 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164\": container with ID starting with a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164 not found: ID does not exist" containerID="a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164"
Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.003357 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164"} err="failed to get container status \"a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164\": rpc error: code = NotFound desc = could not find container \"a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164\": container with ID starting with a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164 not found: ID does not exist"
Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.849432 4885 generic.go:334] "Generic (PLEG): container finished" podID="099da518-0e8c-4661-86bf-efcce5fd4f59" containerID="34b50f5f966e37811bb8a32ad3d6e1abb40b701290a57e1661e950c1bc924933" exitCode=0
Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.849512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550080-q8z87" event={"ID":"099da518-0e8c-4661-86bf-efcce5fd4f59","Type":"ContainerDied","Data":"34b50f5f966e37811bb8a32ad3d6e1abb40b701290a57e1661e950c1bc924933"}
Mar 08 21:20:05 crc kubenswrapper[4885]: I0308 21:20:05.389527 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" path="/var/lib/kubelet/pods/55c2a47a-66e1-4e37-a252-04e2b98eb7bf/volumes"
Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.256451 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550080-q8z87"
Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.448509 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b62p8\" (UniqueName: \"kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8\") pod \"099da518-0e8c-4661-86bf-efcce5fd4f59\" (UID: \"099da518-0e8c-4661-86bf-efcce5fd4f59\") "
Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.458027 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8" (OuterVolumeSpecName: "kube-api-access-b62p8") pod "099da518-0e8c-4661-86bf-efcce5fd4f59" (UID: "099da518-0e8c-4661-86bf-efcce5fd4f59"). InnerVolumeSpecName "kube-api-access-b62p8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.551941 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b62p8\" (UniqueName: \"kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8\") on node \"crc\" DevicePath \"\""
Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.887429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550080-q8z87" event={"ID":"099da518-0e8c-4661-86bf-efcce5fd4f59","Type":"ContainerDied","Data":"bd5550eb8bb8b9907536a5623038cd9d8f560a22bc77082293d5ea76e71eee9e"}
Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.887752 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd5550eb8bb8b9907536a5623038cd9d8f560a22bc77082293d5ea76e71eee9e"
Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.887820 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550080-q8z87"
Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.951723 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550074-lqgkm"]
Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.964776 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550074-lqgkm"]
Mar 08 21:20:07 crc kubenswrapper[4885]: I0308 21:20:07.393174 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a0dd2e1-2283-49bf-b5d1-deb889245d93" path="/var/lib/kubelet/pods/3a0dd2e1-2283-49bf-b5d1-deb889245d93/volumes"
Mar 08 21:20:09 crc kubenswrapper[4885]: I0308 21:20:09.927639 4885 scope.go:117] "RemoveContainer" containerID="4f877a81b26ff73a856887aee5ba6b11b65e1a2c9a19d4946db0e527c8b1dfc9"
Mar 08 21:20:09 crc kubenswrapper[4885]: I0308 21:20:09.962501 4885 scope.go:117] "RemoveContainer" containerID="52641a15b7d3eed6bc15113db82cacc9c6bd5304460efbff7b7427e47ea8d579"
Mar 08 21:20:10 crc kubenswrapper[4885]: I0308 21:20:10.023266 4885 scope.go:117] "RemoveContainer" containerID="10f01dfd93c84f82b0e33850f2cd43179983bc59ab2fd73179f62505bcc743de"
Mar 08 21:20:10 crc kubenswrapper[4885]: I0308 21:20:10.066809 4885 scope.go:117] "RemoveContainer" containerID="787b1783aeddf609e5e191b59369719fad9abea1b0e98367db13c4196466f2fe"
Mar 08 21:20:10 crc kubenswrapper[4885]: I0308 21:20:10.119847 4885 scope.go:117] "RemoveContainer" containerID="49b008c658a6440bfe62e05cf707f768838f42546076db4ab0906a7cc3f15598"
Mar 08 21:20:24 crc kubenswrapper[4885]: I0308 21:20:24.052149 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-bvj5k"]
Mar 08 21:20:24 crc kubenswrapper[4885]: I0308 21:20:24.063450 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-bvj5k"]
Mar 08 21:20:25 crc kubenswrapper[4885]: I0308 21:20:25.387491 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" path="/var/lib/kubelet/pods/020ad790-8a8c-4e05-b3da-d6b823bb37e2/volumes"
Mar 08 21:20:32 crc kubenswrapper[4885]: I0308 21:20:32.818087 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 21:20:32 crc kubenswrapper[4885]: I0308 21:20:32.818614 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.086715 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-krhv4"]
Mar 08 21:20:41 crc kubenswrapper[4885]: E0308 21:20:41.088409 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="extract-utilities"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.088442 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="extract-utilities"
Mar 08 21:20:41 crc kubenswrapper[4885]: E0308 21:20:41.088500 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="registry-server"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.088519 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="registry-server"
Mar 08 21:20:41 crc kubenswrapper[4885]: E0308 21:20:41.088553 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="extract-content"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.088571 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="extract-content"
Mar 08 21:20:41 crc kubenswrapper[4885]: E0308 21:20:41.088615 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099da518-0e8c-4661-86bf-efcce5fd4f59" containerName="oc"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.088633 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="099da518-0e8c-4661-86bf-efcce5fd4f59" containerName="oc"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.089275 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="099da518-0e8c-4661-86bf-efcce5fd4f59" containerName="oc"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.089308 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="registry-server"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.093182 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.109404 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krhv4"]
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.198358 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.198543 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9lh8\" (UniqueName: \"kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.198707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.301434 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9lh8\" (UniqueName: \"kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.301572 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.301779 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.302047 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.302283 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.328155 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9lh8\" (UniqueName: \"kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.423013 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.957337 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krhv4"]
Mar 08 21:20:42 crc kubenswrapper[4885]: I0308 21:20:42.310533 4885 generic.go:334] "Generic (PLEG): container finished" podID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerID="a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a" exitCode=0
Mar 08 21:20:42 crc kubenswrapper[4885]: I0308 21:20:42.310905 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerDied","Data":"a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a"}
Mar 08 21:20:42 crc kubenswrapper[4885]: I0308 21:20:42.310985 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerStarted","Data":"3d026d2edcc32f59d1fb64048cf1fa1e8e9381630009926a8e768bbf9b68b079"}
Mar 08 21:20:44 crc kubenswrapper[4885]: I0308 21:20:44.331610 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerStarted","Data":"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc"}
Mar 08 21:20:50 crc kubenswrapper[4885]: I0308 21:20:50.411222 4885 generic.go:334] "Generic (PLEG): container finished" podID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerID="b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc" exitCode=0
Mar 08 21:20:50 crc kubenswrapper[4885]: I0308 21:20:50.411416 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerDied","Data":"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc"}
Mar 08 21:20:51 crc kubenswrapper[4885]: I0308 21:20:51.424117 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerStarted","Data":"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1"}
Mar 08 21:20:51 crc kubenswrapper[4885]: I0308 21:20:51.445762 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-krhv4" podStartSLOduration=1.9560979330000001 podStartE2EDuration="10.445732456s" podCreationTimestamp="2026-03-08 21:20:41 +0000 UTC" firstStartedPulling="2026-03-08 21:20:42.312720018 +0000 UTC m=+6543.708774051" lastFinishedPulling="2026-03-08 21:20:50.802354551 +0000 UTC m=+6552.198408574" observedRunningTime="2026-03-08 21:20:51.444984726 +0000 UTC m=+6552.841038759" watchObservedRunningTime="2026-03-08 21:20:51.445732456 +0000 UTC m=+6552.841786529"
Mar 08 21:21:01 crc kubenswrapper[4885]: I0308 21:21:01.423452 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:21:01 crc kubenswrapper[4885]: I0308 21:21:01.424163 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:21:01 crc kubenswrapper[4885]: I0308 21:21:01.498054 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:21:01 crc kubenswrapper[4885]: I0308 21:21:01.614056 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:21:01 crc kubenswrapper[4885]: I0308 21:21:01.760374 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krhv4"]
Mar 08 21:21:02 crc kubenswrapper[4885]: I0308 21:21:02.819296 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 21:21:02 crc kubenswrapper[4885]: I0308 21:21:02.820145 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 21:21:03 crc kubenswrapper[4885]: I0308 21:21:03.573660 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-krhv4" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="registry-server" containerID="cri-o://cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1" gracePeriod=2
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.170806 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.252022 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities\") pod \"86b928a5-44bc-4427-93bf-03426bea7ef0\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") "
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.252150 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content\") pod \"86b928a5-44bc-4427-93bf-03426bea7ef0\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") "
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.252385 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9lh8\" (UniqueName: \"kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8\") pod \"86b928a5-44bc-4427-93bf-03426bea7ef0\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") "
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.254729 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities" (OuterVolumeSpecName: "utilities") pod "86b928a5-44bc-4427-93bf-03426bea7ef0" (UID: "86b928a5-44bc-4427-93bf-03426bea7ef0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.268111 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8" (OuterVolumeSpecName: "kube-api-access-t9lh8") pod "86b928a5-44bc-4427-93bf-03426bea7ef0" (UID: "86b928a5-44bc-4427-93bf-03426bea7ef0"). InnerVolumeSpecName "kube-api-access-t9lh8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.343546 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86b928a5-44bc-4427-93bf-03426bea7ef0" (UID: "86b928a5-44bc-4427-93bf-03426bea7ef0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.354295 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9lh8\" (UniqueName: \"kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8\") on node \"crc\" DevicePath \"\""
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.354335 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.354344 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.593246 4885 generic.go:334] "Generic (PLEG): container finished" podID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerID="cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1" exitCode=0
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.593306 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerDied","Data":"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1"}
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.593347 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerDied","Data":"3d026d2edcc32f59d1fb64048cf1fa1e8e9381630009926a8e768bbf9b68b079"}
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.593377 4885 scope.go:117] "RemoveContainer" containerID="cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1"
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.593569 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krhv4"
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.627999 4885 scope.go:117] "RemoveContainer" containerID="b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc"
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.637327 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krhv4"]
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.663700 4885 scope.go:117] "RemoveContainer" containerID="a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a"
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.664724 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-krhv4"]
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.708302 4885 scope.go:117] "RemoveContainer" containerID="cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1"
Mar 08 21:21:04 crc kubenswrapper[4885]: E0308 21:21:04.708935 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1\": container with ID starting with cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1 not found: ID does not exist" containerID="cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1"
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.709005 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1"} err="failed to get container status \"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1\": rpc error: code = NotFound desc = could not find container \"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1\": container with ID starting with cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1 not found: ID does not exist"
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.709036 4885 scope.go:117] "RemoveContainer" containerID="b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc"
Mar 08 21:21:04 crc kubenswrapper[4885]: E0308 21:21:04.709485 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc\": container with ID starting with b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc not found: ID does not exist" containerID="b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc"
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.709545 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc"} err="failed to get container status \"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc\": rpc error: code = NotFound desc = could not find container \"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc\": container with ID starting with b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc not found: ID does not exist"
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.709594 4885 scope.go:117] "RemoveContainer" containerID="a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a"
Mar 08 21:21:04 crc kubenswrapper[4885]: E0308 21:21:04.709987 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a\": container with ID starting with a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a not found: ID does not exist" containerID="a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a"
Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.710024 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a"} err="failed to get container status \"a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a\": rpc error: code = NotFound desc = could not find container \"a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a\": container with ID starting with a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a not found: ID does not exist"
Mar 08 21:21:05 crc kubenswrapper[4885]: I0308 21:21:05.387044 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" path="/var/lib/kubelet/pods/86b928a5-44bc-4427-93bf-03426bea7ef0/volumes"
Mar 08 21:21:10 crc kubenswrapper[4885]: I0308 21:21:10.370758 4885 scope.go:117] "RemoveContainer" containerID="3c54259ed2f9cad06d6b82cb394e43cf58f48cf3c6a30620f67aa6eb4637dc84"
Mar 08 21:21:10 crc kubenswrapper[4885]: I0308 21:21:10.418494 4885 scope.go:117] "RemoveContainer" containerID="952b1007a07234fed2a2f1ecec5204600c8240958b93368c30ef4f62fcb4517a"
Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.818155 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.818976 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.819052 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.820173 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.820280 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3" gracePeriod=600
Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.955193 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3" exitCode=0
Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.955235 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3"}
Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.955265 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"
Mar 08 21:21:33 crc kubenswrapper[4885]: I0308 21:21:33.967513 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"}
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.154207 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550082-r8zzq"]
Mar 08 21:22:00 crc kubenswrapper[4885]: E0308 21:22:00.155505 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="registry-server"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.155531 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="registry-server"
Mar 08 21:22:00 crc kubenswrapper[4885]: E0308 21:22:00.155566 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="extract-utilities"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.155575 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="extract-utilities"
Mar 08 21:22:00 crc kubenswrapper[4885]: E0308 21:22:00.155594 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="extract-content"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.155602 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="extract-content"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.155903 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="registry-server"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.156844 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550082-r8zzq"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.159789 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.159893 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.159993 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.169536 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550082-r8zzq"]
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.208808 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbt2j\" (UniqueName: \"kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j\") pod \"auto-csr-approver-29550082-r8zzq\" (UID: \"b683b024-7ab4-40e6-9380-ad5f3c4c9751\") " pod="openshift-infra/auto-csr-approver-29550082-r8zzq"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.311349 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbt2j\" (UniqueName: \"kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j\") pod \"auto-csr-approver-29550082-r8zzq\" (UID: \"b683b024-7ab4-40e6-9380-ad5f3c4c9751\") " pod="openshift-infra/auto-csr-approver-29550082-r8zzq"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.336093 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbt2j\" (UniqueName: \"kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j\") pod \"auto-csr-approver-29550082-r8zzq\" (UID: \"b683b024-7ab4-40e6-9380-ad5f3c4c9751\") " pod="openshift-infra/auto-csr-approver-29550082-r8zzq"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.478835 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550082-r8zzq"
Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.986712 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550082-r8zzq"]
Mar 08 21:22:01 crc kubenswrapper[4885]: I0308 21:22:01.393117 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" event={"ID":"b683b024-7ab4-40e6-9380-ad5f3c4c9751","Type":"ContainerStarted","Data":"ae0e4390b6ca6abba67b9c50a489defce6283624f6c478fb25f1e982d66f647a"}
Mar 08 21:22:02 crc kubenswrapper[4885]: I0308 21:22:02.405267 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" event={"ID":"b683b024-7ab4-40e6-9380-ad5f3c4c9751","Type":"ContainerStarted","Data":"d8208765f1f2335c1dc540c6cbadcec4aefebc9eff84e251842efe2b691b630b"}
Mar 08 21:22:02 crc kubenswrapper[4885]: I0308 21:22:02.426162 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" podStartSLOduration=1.388172011 podStartE2EDuration="2.426139732s" podCreationTimestamp="2026-03-08 21:22:00 +0000 UTC" firstStartedPulling="2026-03-08 21:22:00.987550252 +0000 UTC m=+6622.383604305" lastFinishedPulling="2026-03-08 21:22:02.025517983 +0000 UTC m=+6623.421572026" observedRunningTime="2026-03-08 21:22:02.420564272 +0000 UTC m=+6623.816618305" watchObservedRunningTime="2026-03-08 21:22:02.426139732 +0000 UTC m=+6623.822193775"
Mar 08 21:22:03 crc kubenswrapper[4885]: I0308 21:22:03.420578 4885 generic.go:334] "Generic (PLEG): container finished" podID="b683b024-7ab4-40e6-9380-ad5f3c4c9751" containerID="d8208765f1f2335c1dc540c6cbadcec4aefebc9eff84e251842efe2b691b630b" exitCode=0
Mar 08 21:22:03 crc kubenswrapper[4885]: I0308 21:22:03.421150 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" event={"ID":"b683b024-7ab4-40e6-9380-ad5f3c4c9751","Type":"ContainerDied","Data":"d8208765f1f2335c1dc540c6cbadcec4aefebc9eff84e251842efe2b691b630b"}
Mar 08 21:22:04 crc kubenswrapper[4885]: I0308 21:22:04.915761 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550082-r8zzq"
Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.029854 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbt2j\" (UniqueName: \"kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j\") pod \"b683b024-7ab4-40e6-9380-ad5f3c4c9751\" (UID: \"b683b024-7ab4-40e6-9380-ad5f3c4c9751\") "
Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.034908 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j" (OuterVolumeSpecName: "kube-api-access-sbt2j") pod "b683b024-7ab4-40e6-9380-ad5f3c4c9751" (UID: "b683b024-7ab4-40e6-9380-ad5f3c4c9751"). InnerVolumeSpecName "kube-api-access-sbt2j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.133113 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbt2j\" (UniqueName: \"kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j\") on node \"crc\" DevicePath \"\""
Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.450746 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" event={"ID":"b683b024-7ab4-40e6-9380-ad5f3c4c9751","Type":"ContainerDied","Data":"ae0e4390b6ca6abba67b9c50a489defce6283624f6c478fb25f1e982d66f647a"}
Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.450808 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae0e4390b6ca6abba67b9c50a489defce6283624f6c478fb25f1e982d66f647a"
Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.450873 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550082-r8zzq"
Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.526272 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550076-qtxj8"]
Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.540297 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550076-qtxj8"]
Mar 08 21:22:07 crc kubenswrapper[4885]: I0308 21:22:07.381005 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6963ac4b-0b7b-489f-a98a-7bad7270d510" path="/var/lib/kubelet/pods/6963ac4b-0b7b-489f-a98a-7bad7270d510/volumes"
Mar 08 21:22:10 crc kubenswrapper[4885]: I0308 21:22:10.493964 4885 scope.go:117] "RemoveContainer" containerID="e527652c3f32f5179c847a20bd0a6dafb8df7997ca784a705d3328979c68ce90"
Mar 08 21:23:07 crc kubenswrapper[4885]: I0308 21:23:07.051446 4885 kubelet.go:2437] "SyncLoop DELETE" source="api"
pods=["openstack/heat-db-create-rstwd"] Mar 08 21:23:07 crc kubenswrapper[4885]: I0308 21:23:07.066028 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-rstwd"] Mar 08 21:23:07 crc kubenswrapper[4885]: I0308 21:23:07.383402 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b956841b-a9a1-4c38-99e9-05c6e5f9f363" path="/var/lib/kubelet/pods/b956841b-a9a1-4c38-99e9-05c6e5f9f363/volumes" Mar 08 21:23:08 crc kubenswrapper[4885]: I0308 21:23:08.064284 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-6865-account-create-update-5p2d8"] Mar 08 21:23:08 crc kubenswrapper[4885]: I0308 21:23:08.076787 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-6865-account-create-update-5p2d8"] Mar 08 21:23:09 crc kubenswrapper[4885]: I0308 21:23:09.380571 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49449c65-7a7c-437f-b4d9-23b2a219485f" path="/var/lib/kubelet/pods/49449c65-7a7c-437f-b4d9-23b2a219485f/volumes" Mar 08 21:23:10 crc kubenswrapper[4885]: I0308 21:23:10.634936 4885 scope.go:117] "RemoveContainer" containerID="d70a935397b593663bdf26afd9e76f5f57ebfae75c4ef218c57dc585c1689e21" Mar 08 21:23:10 crc kubenswrapper[4885]: I0308 21:23:10.658116 4885 scope.go:117] "RemoveContainer" containerID="9a171ca46b7a7190a2333a95503ee285e088cd83dd56cbc13cb8f2021b946782" Mar 08 21:23:23 crc kubenswrapper[4885]: I0308 21:23:23.058630 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-fs9dx"] Mar 08 21:23:23 crc kubenswrapper[4885]: I0308 21:23:23.067805 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-fs9dx"] Mar 08 21:23:23 crc kubenswrapper[4885]: I0308 21:23:23.385101 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="495d39cf-6a4d-4ca0-90b6-9a22323d1568" path="/var/lib/kubelet/pods/495d39cf-6a4d-4ca0-90b6-9a22323d1568/volumes" Mar 08 21:24:00 crc kubenswrapper[4885]: 
I0308 21:24:00.156850 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550084-9r2m6"] Mar 08 21:24:00 crc kubenswrapper[4885]: E0308 21:24:00.158438 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b683b024-7ab4-40e6-9380-ad5f3c4c9751" containerName="oc" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.158464 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b683b024-7ab4-40e6-9380-ad5f3c4c9751" containerName="oc" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.158837 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b683b024-7ab4-40e6-9380-ad5f3c4c9751" containerName="oc" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.160690 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.165370 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.165775 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.166047 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.178978 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550084-9r2m6"] Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.286288 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjzxp\" (UniqueName: \"kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp\") pod \"auto-csr-approver-29550084-9r2m6\" (UID: \"3d2e43f9-d4b1-4059-b714-26745e0d96ce\") " 
pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.388818 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjzxp\" (UniqueName: \"kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp\") pod \"auto-csr-approver-29550084-9r2m6\" (UID: \"3d2e43f9-d4b1-4059-b714-26745e0d96ce\") " pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.412339 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjzxp\" (UniqueName: \"kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp\") pod \"auto-csr-approver-29550084-9r2m6\" (UID: \"3d2e43f9-d4b1-4059-b714-26745e0d96ce\") " pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.485233 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:01 crc kubenswrapper[4885]: I0308 21:24:01.051224 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550084-9r2m6"] Mar 08 21:24:01 crc kubenswrapper[4885]: I0308 21:24:01.229795 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" event={"ID":"3d2e43f9-d4b1-4059-b714-26745e0d96ce","Type":"ContainerStarted","Data":"b56a346f23a5ff393f482bfe2e6c0d17c5b4dcf44148e8c127f56167e792f65a"} Mar 08 21:24:02 crc kubenswrapper[4885]: I0308 21:24:02.818410 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:24:02 crc kubenswrapper[4885]: I0308 21:24:02.819062 4885 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:24:03 crc kubenswrapper[4885]: I0308 21:24:03.257825 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d2e43f9-d4b1-4059-b714-26745e0d96ce" containerID="b81c31bbcbb29c0b9da44d4b1b46d19caf5f3201ab351fd09c58a29971e19359" exitCode=0 Mar 08 21:24:03 crc kubenswrapper[4885]: I0308 21:24:03.257888 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" event={"ID":"3d2e43f9-d4b1-4059-b714-26745e0d96ce","Type":"ContainerDied","Data":"b81c31bbcbb29c0b9da44d4b1b46d19caf5f3201ab351fd09c58a29971e19359"} Mar 08 21:24:04 crc kubenswrapper[4885]: I0308 21:24:04.769621 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:04 crc kubenswrapper[4885]: I0308 21:24:04.898877 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjzxp\" (UniqueName: \"kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp\") pod \"3d2e43f9-d4b1-4059-b714-26745e0d96ce\" (UID: \"3d2e43f9-d4b1-4059-b714-26745e0d96ce\") " Mar 08 21:24:04 crc kubenswrapper[4885]: I0308 21:24:04.910099 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp" (OuterVolumeSpecName: "kube-api-access-sjzxp") pod "3d2e43f9-d4b1-4059-b714-26745e0d96ce" (UID: "3d2e43f9-d4b1-4059-b714-26745e0d96ce"). InnerVolumeSpecName "kube-api-access-sjzxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.002246 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjzxp\" (UniqueName: \"kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp\") on node \"crc\" DevicePath \"\"" Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.285186 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" event={"ID":"3d2e43f9-d4b1-4059-b714-26745e0d96ce","Type":"ContainerDied","Data":"b56a346f23a5ff393f482bfe2e6c0d17c5b4dcf44148e8c127f56167e792f65a"} Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.286253 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56a346f23a5ff393f482bfe2e6c0d17c5b4dcf44148e8c127f56167e792f65a" Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.285268 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.871743 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550078-gspxv"] Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.882740 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550078-gspxv"] Mar 08 21:24:07 crc kubenswrapper[4885]: I0308 21:24:07.382938 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" path="/var/lib/kubelet/pods/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee/volumes" Mar 08 21:24:10 crc kubenswrapper[4885]: I0308 21:24:10.764004 4885 scope.go:117] "RemoveContainer" containerID="cdd1e2f0b8ab5f05aa992613d4a0e2df88f958ed88a6e2f7da9cadebeb33bfdc" Mar 08 21:24:10 crc kubenswrapper[4885]: I0308 21:24:10.854223 4885 scope.go:117] "RemoveContainer" 
containerID="27c7130d460aa9b10cdac4b0fbc1bf5fbafc6534f511b76384c6bd5cf7ea008a" Mar 08 21:24:32 crc kubenswrapper[4885]: I0308 21:24:32.818163 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:24:32 crc kubenswrapper[4885]: I0308 21:24:32.818735 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:24:42 crc kubenswrapper[4885]: I0308 21:24:42.958723 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:42 crc kubenswrapper[4885]: E0308 21:24:42.960126 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d2e43f9-d4b1-4059-b714-26745e0d96ce" containerName="oc" Mar 08 21:24:42 crc kubenswrapper[4885]: I0308 21:24:42.960149 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2e43f9-d4b1-4059-b714-26745e0d96ce" containerName="oc" Mar 08 21:24:42 crc kubenswrapper[4885]: I0308 21:24:42.960508 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d2e43f9-d4b1-4059-b714-26745e0d96ce" containerName="oc" Mar 08 21:24:42 crc kubenswrapper[4885]: I0308 21:24:42.963269 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:42 crc kubenswrapper[4885]: I0308 21:24:42.973317 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.041406 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.041725 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.041832 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrst7\" (UniqueName: \"kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.144121 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.144206 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.144344 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrst7\" (UniqueName: \"kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.144661 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.144905 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.166534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrst7\" (UniqueName: \"kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.307220 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.886089 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:44 crc kubenswrapper[4885]: I0308 21:24:44.820732 4885 generic.go:334] "Generic (PLEG): container finished" podID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerID="9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574" exitCode=0 Mar 08 21:24:44 crc kubenswrapper[4885]: I0308 21:24:44.820808 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerDied","Data":"9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574"} Mar 08 21:24:44 crc kubenswrapper[4885]: I0308 21:24:44.821023 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerStarted","Data":"da4afd917b03cf9f273dbbf6900868bc97461a3068639de562d9b471c1e46179"} Mar 08 21:24:45 crc kubenswrapper[4885]: I0308 21:24:45.846432 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerStarted","Data":"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee"} Mar 08 21:24:47 crc kubenswrapper[4885]: I0308 21:24:47.874214 4885 generic.go:334] "Generic (PLEG): container finished" podID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerID="2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee" exitCode=0 Mar 08 21:24:47 crc kubenswrapper[4885]: I0308 21:24:47.874300 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" 
event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerDied","Data":"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee"} Mar 08 21:24:48 crc kubenswrapper[4885]: I0308 21:24:48.888995 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerStarted","Data":"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653"} Mar 08 21:24:48 crc kubenswrapper[4885]: I0308 21:24:48.930306 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7w5nj" podStartSLOduration=3.403980383 podStartE2EDuration="6.930269751s" podCreationTimestamp="2026-03-08 21:24:42 +0000 UTC" firstStartedPulling="2026-03-08 21:24:44.823105821 +0000 UTC m=+6786.219159844" lastFinishedPulling="2026-03-08 21:24:48.349395189 +0000 UTC m=+6789.745449212" observedRunningTime="2026-03-08 21:24:48.924382164 +0000 UTC m=+6790.320436217" watchObservedRunningTime="2026-03-08 21:24:48.930269751 +0000 UTC m=+6790.326323804" Mar 08 21:24:53 crc kubenswrapper[4885]: I0308 21:24:53.308127 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:53 crc kubenswrapper[4885]: I0308 21:24:53.308733 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:53 crc kubenswrapper[4885]: I0308 21:24:53.363370 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:54 crc kubenswrapper[4885]: I0308 21:24:54.005059 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:54 crc kubenswrapper[4885]: I0308 21:24:54.057892 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:55 crc kubenswrapper[4885]: I0308 21:24:55.960356 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7w5nj" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="registry-server" containerID="cri-o://fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653" gracePeriod=2 Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.509299 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.571562 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities\") pod \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.571632 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content\") pod \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.571823 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrst7\" (UniqueName: \"kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7\") pod \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.574236 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities" (OuterVolumeSpecName: "utilities") pod "cded65d6-06d8-49c9-8bc3-0223d72ee23c" (UID: 
"cded65d6-06d8-49c9-8bc3-0223d72ee23c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.582186 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7" (OuterVolumeSpecName: "kube-api-access-xrst7") pod "cded65d6-06d8-49c9-8bc3-0223d72ee23c" (UID: "cded65d6-06d8-49c9-8bc3-0223d72ee23c"). InnerVolumeSpecName "kube-api-access-xrst7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.648681 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cded65d6-06d8-49c9-8bc3-0223d72ee23c" (UID: "cded65d6-06d8-49c9-8bc3-0223d72ee23c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.674668 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrst7\" (UniqueName: \"kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7\") on node \"crc\" DevicePath \"\"" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.674700 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.674711 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.975676 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerID="fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653" exitCode=0 Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.975746 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerDied","Data":"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653"} Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.975800 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerDied","Data":"da4afd917b03cf9f273dbbf6900868bc97461a3068639de562d9b471c1e46179"} Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.975758 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.975823 4885 scope.go:117] "RemoveContainer" containerID="fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.027313 4885 scope.go:117] "RemoveContainer" containerID="2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.029776 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.041314 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.064100 4885 scope.go:117] "RemoveContainer" containerID="9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.105201 4885 scope.go:117] "RemoveContainer" 
containerID="fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653" Mar 08 21:24:57 crc kubenswrapper[4885]: E0308 21:24:57.105950 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653\": container with ID starting with fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653 not found: ID does not exist" containerID="fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.106006 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653"} err="failed to get container status \"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653\": rpc error: code = NotFound desc = could not find container \"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653\": container with ID starting with fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653 not found: ID does not exist" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.106037 4885 scope.go:117] "RemoveContainer" containerID="2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee" Mar 08 21:24:57 crc kubenswrapper[4885]: E0308 21:24:57.106870 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee\": container with ID starting with 2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee not found: ID does not exist" containerID="2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.106946 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee"} err="failed to get container status \"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee\": rpc error: code = NotFound desc = could not find container \"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee\": container with ID starting with 2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee not found: ID does not exist" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.106981 4885 scope.go:117] "RemoveContainer" containerID="9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574" Mar 08 21:24:57 crc kubenswrapper[4885]: E0308 21:24:57.107447 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574\": container with ID starting with 9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574 not found: ID does not exist" containerID="9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.107480 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574"} err="failed to get container status \"9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574\": rpc error: code = NotFound desc = could not find container \"9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574\": container with ID starting with 9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574 not found: ID does not exist" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.395831 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" path="/var/lib/kubelet/pods/cded65d6-06d8-49c9-8bc3-0223d72ee23c/volumes" Mar 08 21:25:02 crc kubenswrapper[4885]: I0308 
21:25:02.817671 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:25:02 crc kubenswrapper[4885]: I0308 21:25:02.818844 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:25:02 crc kubenswrapper[4885]: I0308 21:25:02.819148 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:25:02 crc kubenswrapper[4885]: I0308 21:25:02.819977 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:25:02 crc kubenswrapper[4885]: I0308 21:25:02.820092 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" gracePeriod=600 Mar 08 21:25:02 crc kubenswrapper[4885]: E0308 21:25:02.938267 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:03 crc kubenswrapper[4885]: I0308 21:25:03.042282 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" exitCode=0 Mar 08 21:25:03 crc kubenswrapper[4885]: I0308 21:25:03.042321 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"} Mar 08 21:25:03 crc kubenswrapper[4885]: I0308 21:25:03.042349 4885 scope.go:117] "RemoveContainer" containerID="17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3" Mar 08 21:25:03 crc kubenswrapper[4885]: I0308 21:25:03.042991 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:25:03 crc kubenswrapper[4885]: E0308 21:25:03.043273 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:14 crc kubenswrapper[4885]: I0308 21:25:14.368721 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:25:14 crc kubenswrapper[4885]: E0308 21:25:14.369522 4885 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:29 crc kubenswrapper[4885]: I0308 21:25:29.382252 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:25:29 crc kubenswrapper[4885]: E0308 21:25:29.383149 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:41 crc kubenswrapper[4885]: I0308 21:25:41.369045 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:25:41 crc kubenswrapper[4885]: E0308 21:25:41.370699 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:55 crc kubenswrapper[4885]: I0308 21:25:55.370779 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:25:55 crc kubenswrapper[4885]: E0308 21:25:55.372108 4885 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:56 crc kubenswrapper[4885]: I0308 21:25:56.066602 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-148a-account-create-update-vw6hm"] Mar 08 21:25:56 crc kubenswrapper[4885]: I0308 21:25:56.083599 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-jfnt7"] Mar 08 21:25:56 crc kubenswrapper[4885]: I0308 21:25:56.098851 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-jfnt7"] Mar 08 21:25:56 crc kubenswrapper[4885]: I0308 21:25:56.109868 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-148a-account-create-update-vw6hm"] Mar 08 21:25:57 crc kubenswrapper[4885]: I0308 21:25:57.388944 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92cde86b-0d50-444d-b116-e32fbf5004f9" path="/var/lib/kubelet/pods/92cde86b-0d50-444d-b116-e32fbf5004f9/volumes" Mar 08 21:25:57 crc kubenswrapper[4885]: I0308 21:25:57.392748 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b454a1c4-958a-40a9-8c50-9154281574fd" path="/var/lib/kubelet/pods/b454a1c4-958a-40a9-8c50-9154281574fd/volumes" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.175873 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550086-vhcf6"] Mar 08 21:26:00 crc kubenswrapper[4885]: E0308 21:26:00.176883 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="registry-server" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.176899 4885 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="registry-server" Mar 08 21:26:00 crc kubenswrapper[4885]: E0308 21:26:00.176952 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="extract-content" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.176962 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="extract-content" Mar 08 21:26:00 crc kubenswrapper[4885]: E0308 21:26:00.176992 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="extract-utilities" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.177001 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="extract-utilities" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.177260 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="registry-server" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.178159 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.181279 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.181628 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.182556 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.187658 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550086-vhcf6"] Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.357468 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z448l\" (UniqueName: \"kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l\") pod \"auto-csr-approver-29550086-vhcf6\" (UID: \"e5e490c3-347d-4c4b-aa26-3f680e0bebc0\") " pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.459543 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z448l\" (UniqueName: \"kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l\") pod \"auto-csr-approver-29550086-vhcf6\" (UID: \"e5e490c3-347d-4c4b-aa26-3f680e0bebc0\") " pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.488784 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z448l\" (UniqueName: \"kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l\") pod \"auto-csr-approver-29550086-vhcf6\" (UID: \"e5e490c3-347d-4c4b-aa26-3f680e0bebc0\") " 
pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.514206 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:01 crc kubenswrapper[4885]: I0308 21:26:01.111428 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550086-vhcf6"] Mar 08 21:26:01 crc kubenswrapper[4885]: I0308 21:26:01.119229 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:26:01 crc kubenswrapper[4885]: I0308 21:26:01.770710 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" event={"ID":"e5e490c3-347d-4c4b-aa26-3f680e0bebc0","Type":"ContainerStarted","Data":"68d9980f62d4954a3285c86bb44d8da1bd0eef319852e66579b38c24d2a25bce"} Mar 08 21:26:02 crc kubenswrapper[4885]: I0308 21:26:02.784710 4885 generic.go:334] "Generic (PLEG): container finished" podID="e5e490c3-347d-4c4b-aa26-3f680e0bebc0" containerID="a2343f387ee044683e4b5c10c184262a0a1ddcb6bceb9b6768e6cef7d9c4c637" exitCode=0 Mar 08 21:26:02 crc kubenswrapper[4885]: I0308 21:26:02.784795 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" event={"ID":"e5e490c3-347d-4c4b-aa26-3f680e0bebc0","Type":"ContainerDied","Data":"a2343f387ee044683e4b5c10c184262a0a1ddcb6bceb9b6768e6cef7d9c4c637"} Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.241903 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.362895 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z448l\" (UniqueName: \"kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l\") pod \"e5e490c3-347d-4c4b-aa26-3f680e0bebc0\" (UID: \"e5e490c3-347d-4c4b-aa26-3f680e0bebc0\") " Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.373244 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l" (OuterVolumeSpecName: "kube-api-access-z448l") pod "e5e490c3-347d-4c4b-aa26-3f680e0bebc0" (UID: "e5e490c3-347d-4c4b-aa26-3f680e0bebc0"). InnerVolumeSpecName "kube-api-access-z448l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.465941 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z448l\" (UniqueName: \"kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l\") on node \"crc\" DevicePath \"\"" Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.811497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" event={"ID":"e5e490c3-347d-4c4b-aa26-3f680e0bebc0","Type":"ContainerDied","Data":"68d9980f62d4954a3285c86bb44d8da1bd0eef319852e66579b38c24d2a25bce"} Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.811565 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d9980f62d4954a3285c86bb44d8da1bd0eef319852e66579b38c24d2a25bce" Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.811616 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:05 crc kubenswrapper[4885]: I0308 21:26:05.343945 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550080-q8z87"] Mar 08 21:26:05 crc kubenswrapper[4885]: I0308 21:26:05.358053 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550080-q8z87"] Mar 08 21:26:05 crc kubenswrapper[4885]: I0308 21:26:05.382369 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="099da518-0e8c-4661-86bf-efcce5fd4f59" path="/var/lib/kubelet/pods/099da518-0e8c-4661-86bf-efcce5fd4f59/volumes" Mar 08 21:26:07 crc kubenswrapper[4885]: I0308 21:26:07.369384 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:26:07 crc kubenswrapper[4885]: E0308 21:26:07.370024 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:26:09 crc kubenswrapper[4885]: I0308 21:26:09.036440 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-hj6ng"] Mar 08 21:26:09 crc kubenswrapper[4885]: I0308 21:26:09.048342 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-hj6ng"] Mar 08 21:26:09 crc kubenswrapper[4885]: I0308 21:26:09.388691 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddddf0b1-83be-4ebb-8318-9d40522a3efb" path="/var/lib/kubelet/pods/ddddf0b1-83be-4ebb-8318-9d40522a3efb/volumes" Mar 08 21:26:11 crc kubenswrapper[4885]: I0308 21:26:11.045966 4885 scope.go:117] 
"RemoveContainer" containerID="a35a9b66ff3babcb2662995c86d66b4cb67b7df0bec572a2fefb5352c1e090cb" Mar 08 21:26:11 crc kubenswrapper[4885]: I0308 21:26:11.095187 4885 scope.go:117] "RemoveContainer" containerID="50f56d5baf9ae0368c43d2d5d7b045c4c64547d6f51fe21432b96e232f3f2393" Mar 08 21:26:11 crc kubenswrapper[4885]: I0308 21:26:11.167401 4885 scope.go:117] "RemoveContainer" containerID="870ff46cb1f6250fba56c9497a2a58f99777f85302f8adb2a09cd3289b27392e" Mar 08 21:26:11 crc kubenswrapper[4885]: I0308 21:26:11.209177 4885 scope.go:117] "RemoveContainer" containerID="34b50f5f966e37811bb8a32ad3d6e1abb40b701290a57e1661e950c1bc924933" Mar 08 21:26:22 crc kubenswrapper[4885]: I0308 21:26:22.369472 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:26:22 crc kubenswrapper[4885]: E0308 21:26:22.370579 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.036437 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-wmgbb"] Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.049557 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-b45a-account-create-update-zt9mb"] Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.060312 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-wmgbb"] Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.069986 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-b45a-account-create-update-zt9mb"] Mar 08 21:26:33 crc 
kubenswrapper[4885]: I0308 21:26:33.368362 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:26:33 crc kubenswrapper[4885]: E0308 21:26:33.368622 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.379533 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="451cc09f-d6aa-4930-be69-102ce5b86575" path="/var/lib/kubelet/pods/451cc09f-d6aa-4930-be69-102ce5b86575/volumes" Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.380583 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="835eb61f-3559-41d5-9891-23a6ecef9ed1" path="/var/lib/kubelet/pods/835eb61f-3559-41d5-9891-23a6ecef9ed1/volumes" Mar 08 21:26:45 crc kubenswrapper[4885]: I0308 21:26:45.369092 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:26:45 crc kubenswrapper[4885]: E0308 21:26:45.370768 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:26:46 crc kubenswrapper[4885]: I0308 21:26:46.056234 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-gtv5s"] Mar 08 21:26:46 crc 
kubenswrapper[4885]: I0308 21:26:46.066810 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-gtv5s"] Mar 08 21:26:47 crc kubenswrapper[4885]: I0308 21:26:47.395077 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4393565c-775a-48fd-a497-602a556ff169" path="/var/lib/kubelet/pods/4393565c-775a-48fd-a497-602a556ff169/volumes" Mar 08 21:26:58 crc kubenswrapper[4885]: I0308 21:26:58.369005 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:26:58 crc kubenswrapper[4885]: E0308 21:26:58.370226 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:27:11 crc kubenswrapper[4885]: I0308 21:27:11.372791 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:27:11 crc kubenswrapper[4885]: E0308 21:27:11.374224 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:27:11 crc kubenswrapper[4885]: I0308 21:27:11.374424 4885 scope.go:117] "RemoveContainer" containerID="8cfe6ba1dd8d427385a1015c78367bf6a932fa6920ddcb36d2679cfdab2e9416" Mar 08 21:27:11 crc kubenswrapper[4885]: I0308 21:27:11.440969 4885 scope.go:117] 
"RemoveContainer" containerID="ff2590b431e04ce466ce231540eb022968990845b1b8a9f29903a084f907a810" Mar 08 21:27:11 crc kubenswrapper[4885]: I0308 21:27:11.515061 4885 scope.go:117] "RemoveContainer" containerID="ab2bac58c78cebfa3dc65d3179c712fb4a25e9ae89fdc3f09281d9b68706ac0c" Mar 08 21:27:23 crc kubenswrapper[4885]: I0308 21:27:23.369295 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:27:23 crc kubenswrapper[4885]: E0308 21:27:23.370734 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:27:34 crc kubenswrapper[4885]: I0308 21:27:34.368652 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:27:34 crc kubenswrapper[4885]: E0308 21:27:34.369734 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:27:48 crc kubenswrapper[4885]: I0308 21:27:48.368897 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:27:48 crc kubenswrapper[4885]: E0308 21:27:48.370587 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.159074 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550088-xzfvm"] Mar 08 21:28:00 crc kubenswrapper[4885]: E0308 21:28:00.160323 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e490c3-347d-4c4b-aa26-3f680e0bebc0" containerName="oc" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.160349 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e490c3-347d-4c4b-aa26-3f680e0bebc0" containerName="oc" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.160744 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e490c3-347d-4c4b-aa26-3f680e0bebc0" containerName="oc" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.161997 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.164390 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.164610 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.168503 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.180369 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550088-xzfvm"] Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.262720 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhpwt\" (UniqueName: \"kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt\") pod \"auto-csr-approver-29550088-xzfvm\" (UID: \"ec092c86-e0c8-415e-bfe7-80914fe8ce5b\") " pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.363737 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhpwt\" (UniqueName: \"kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt\") pod \"auto-csr-approver-29550088-xzfvm\" (UID: \"ec092c86-e0c8-415e-bfe7-80914fe8ce5b\") " pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.384880 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhpwt\" (UniqueName: \"kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt\") pod \"auto-csr-approver-29550088-xzfvm\" (UID: \"ec092c86-e0c8-415e-bfe7-80914fe8ce5b\") " 
pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.518189 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:01 crc kubenswrapper[4885]: I0308 21:28:01.020318 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550088-xzfvm"] Mar 08 21:28:01 crc kubenswrapper[4885]: I0308 21:28:01.273982 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" event={"ID":"ec092c86-e0c8-415e-bfe7-80914fe8ce5b","Type":"ContainerStarted","Data":"83e7b762cd3ceb0476f382df7f8d7a84d710ce4ecb9e0f28389e5bd476e95b52"} Mar 08 21:28:02 crc kubenswrapper[4885]: I0308 21:28:02.369192 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:28:02 crc kubenswrapper[4885]: E0308 21:28:02.370610 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:28:03 crc kubenswrapper[4885]: I0308 21:28:03.301720 4885 generic.go:334] "Generic (PLEG): container finished" podID="ec092c86-e0c8-415e-bfe7-80914fe8ce5b" containerID="38151e84b4b7428445936ba0b4e7f51f9bdc2be5ab2ec1353c272510a10895bd" exitCode=0 Mar 08 21:28:03 crc kubenswrapper[4885]: I0308 21:28:03.301809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" event={"ID":"ec092c86-e0c8-415e-bfe7-80914fe8ce5b","Type":"ContainerDied","Data":"38151e84b4b7428445936ba0b4e7f51f9bdc2be5ab2ec1353c272510a10895bd"} 
Mar 08 21:28:04 crc kubenswrapper[4885]: I0308 21:28:04.747108 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550088-xzfvm"
Mar 08 21:28:04 crc kubenswrapper[4885]: I0308 21:28:04.761847 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhpwt\" (UniqueName: \"kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt\") pod \"ec092c86-e0c8-415e-bfe7-80914fe8ce5b\" (UID: \"ec092c86-e0c8-415e-bfe7-80914fe8ce5b\") "
Mar 08 21:28:04 crc kubenswrapper[4885]: I0308 21:28:04.771992 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt" (OuterVolumeSpecName: "kube-api-access-fhpwt") pod "ec092c86-e0c8-415e-bfe7-80914fe8ce5b" (UID: "ec092c86-e0c8-415e-bfe7-80914fe8ce5b"). InnerVolumeSpecName "kube-api-access-fhpwt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:28:04 crc kubenswrapper[4885]: I0308 21:28:04.865078 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhpwt\" (UniqueName: \"kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt\") on node \"crc\" DevicePath \"\""
Mar 08 21:28:05 crc kubenswrapper[4885]: I0308 21:28:05.327221 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" event={"ID":"ec092c86-e0c8-415e-bfe7-80914fe8ce5b","Type":"ContainerDied","Data":"83e7b762cd3ceb0476f382df7f8d7a84d710ce4ecb9e0f28389e5bd476e95b52"}
Mar 08 21:28:05 crc kubenswrapper[4885]: I0308 21:28:05.327645 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83e7b762cd3ceb0476f382df7f8d7a84d710ce4ecb9e0f28389e5bd476e95b52"
Mar 08 21:28:05 crc kubenswrapper[4885]: I0308 21:28:05.327321 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550088-xzfvm"
Mar 08 21:28:05 crc kubenswrapper[4885]: I0308 21:28:05.835031 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550082-r8zzq"]
Mar 08 21:28:05 crc kubenswrapper[4885]: I0308 21:28:05.843704 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550082-r8zzq"]
Mar 08 21:28:07 crc kubenswrapper[4885]: I0308 21:28:07.387258 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b683b024-7ab4-40e6-9380-ad5f3c4c9751" path="/var/lib/kubelet/pods/b683b024-7ab4-40e6-9380-ad5f3c4c9751/volumes"
Mar 08 21:28:11 crc kubenswrapper[4885]: I0308 21:28:11.646626 4885 scope.go:117] "RemoveContainer" containerID="d8208765f1f2335c1dc540c6cbadcec4aefebc9eff84e251842efe2b691b630b"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.121372 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-727hb"]
Mar 08 21:28:14 crc kubenswrapper[4885]: E0308 21:28:14.122462 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec092c86-e0c8-415e-bfe7-80914fe8ce5b" containerName="oc"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.122477 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec092c86-e0c8-415e-bfe7-80914fe8ce5b" containerName="oc"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.122734 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec092c86-e0c8-415e-bfe7-80914fe8ce5b" containerName="oc"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.124610 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.148440 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-727hb"]
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.303907 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.304247 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz8md\" (UniqueName: \"kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.304329 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.406348 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.406583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.407155 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.407353 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.407575 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz8md\" (UniqueName: \"kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.429809 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz8md\" (UniqueName: \"kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.447417 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.955394 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-727hb"]
Mar 08 21:28:15 crc kubenswrapper[4885]: I0308 21:28:15.456940 4885 generic.go:334] "Generic (PLEG): container finished" podID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerID="8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc" exitCode=0
Mar 08 21:28:15 crc kubenswrapper[4885]: I0308 21:28:15.456988 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerDied","Data":"8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc"}
Mar 08 21:28:15 crc kubenswrapper[4885]: I0308 21:28:15.457219 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerStarted","Data":"07f2ce8eb5cf4e9dbca7c1fd51d8cad03cbdd45dcff7bdcae0d172c480181377"}
Mar 08 21:28:16 crc kubenswrapper[4885]: I0308 21:28:16.369609 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"
Mar 08 21:28:16 crc kubenswrapper[4885]: E0308 21:28:16.370598 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:28:16 crc kubenswrapper[4885]: I0308 21:28:16.468502 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerStarted","Data":"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf"}
Mar 08 21:28:22 crc kubenswrapper[4885]: I0308 21:28:22.542072 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerDied","Data":"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf"}
Mar 08 21:28:22 crc kubenswrapper[4885]: I0308 21:28:22.542029 4885 generic.go:334] "Generic (PLEG): container finished" podID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerID="d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf" exitCode=0
Mar 08 21:28:24 crc kubenswrapper[4885]: I0308 21:28:24.574692 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerStarted","Data":"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10"}
Mar 08 21:28:24 crc kubenswrapper[4885]: I0308 21:28:24.619508 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-727hb" podStartSLOduration=2.717009545 podStartE2EDuration="10.619486503s" podCreationTimestamp="2026-03-08 21:28:14 +0000 UTC" firstStartedPulling="2026-03-08 21:28:15.458436939 +0000 UTC m=+6996.854490962" lastFinishedPulling="2026-03-08 21:28:23.360913867 +0000 UTC m=+7004.756967920" observedRunningTime="2026-03-08 21:28:24.598911967 +0000 UTC m=+7005.994966040" watchObservedRunningTime="2026-03-08 21:28:24.619486503 +0000 UTC m=+7006.015540536"
Mar 08 21:28:31 crc kubenswrapper[4885]: I0308 21:28:31.371451 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"
Mar 08 21:28:31 crc kubenswrapper[4885]: E0308 21:28:31.372234 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:28:34 crc kubenswrapper[4885]: I0308 21:28:34.448135 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:34 crc kubenswrapper[4885]: I0308 21:28:34.449286 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:34 crc kubenswrapper[4885]: I0308 21:28:34.522888 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:34 crc kubenswrapper[4885]: I0308 21:28:34.804694 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:34 crc kubenswrapper[4885]: I0308 21:28:34.876513 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-727hb"]
Mar 08 21:28:36 crc kubenswrapper[4885]: I0308 21:28:36.763280 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-727hb" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="registry-server" containerID="cri-o://a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10" gracePeriod=2
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.326309 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.372602 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz8md\" (UniqueName: \"kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md\") pod \"c690d770-1f1e-4e17-991c-4a7696a26cea\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") "
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.372732 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content\") pod \"c690d770-1f1e-4e17-991c-4a7696a26cea\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") "
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.372778 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities\") pod \"c690d770-1f1e-4e17-991c-4a7696a26cea\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") "
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.374252 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities" (OuterVolumeSpecName: "utilities") pod "c690d770-1f1e-4e17-991c-4a7696a26cea" (UID: "c690d770-1f1e-4e17-991c-4a7696a26cea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.375695 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.383004 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md" (OuterVolumeSpecName: "kube-api-access-dz8md") pod "c690d770-1f1e-4e17-991c-4a7696a26cea" (UID: "c690d770-1f1e-4e17-991c-4a7696a26cea"). InnerVolumeSpecName "kube-api-access-dz8md". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.477586 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz8md\" (UniqueName: \"kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md\") on node \"crc\" DevicePath \"\""
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.510636 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c690d770-1f1e-4e17-991c-4a7696a26cea" (UID: "c690d770-1f1e-4e17-991c-4a7696a26cea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.579438 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.779548 4885 generic.go:334] "Generic (PLEG): container finished" podID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerID="a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10" exitCode=0
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.779611 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerDied","Data":"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10"}
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.779674 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-727hb"
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.779705 4885 scope.go:117] "RemoveContainer" containerID="a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10"
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.779686 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerDied","Data":"07f2ce8eb5cf4e9dbca7c1fd51d8cad03cbdd45dcff7bdcae0d172c480181377"}
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.831830 4885 scope.go:117] "RemoveContainer" containerID="d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf"
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.835505 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-727hb"]
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.846585 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-727hb"]
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.876623 4885 scope.go:117] "RemoveContainer" containerID="8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc"
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.937482 4885 scope.go:117] "RemoveContainer" containerID="a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10"
Mar 08 21:28:37 crc kubenswrapper[4885]: E0308 21:28:37.938392 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10\": container with ID starting with a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10 not found: ID does not exist" containerID="a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10"
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.938464 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10"} err="failed to get container status \"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10\": rpc error: code = NotFound desc = could not find container \"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10\": container with ID starting with a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10 not found: ID does not exist"
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.938506 4885 scope.go:117] "RemoveContainer" containerID="d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf"
Mar 08 21:28:37 crc kubenswrapper[4885]: E0308 21:28:37.939190 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf\": container with ID starting with d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf not found: ID does not exist" containerID="d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf"
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.939292 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf"} err="failed to get container status \"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf\": rpc error: code = NotFound desc = could not find container \"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf\": container with ID starting with d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf not found: ID does not exist"
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.939381 4885 scope.go:117] "RemoveContainer" containerID="8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc"
Mar 08 21:28:37 crc kubenswrapper[4885]: E0308 21:28:37.939974 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc\": container with ID starting with 8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc not found: ID does not exist" containerID="8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc"
Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.940037 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc"} err="failed to get container status \"8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc\": rpc error: code = NotFound desc = could not find container \"8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc\": container with ID starting with 8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc not found: ID does not exist"
Mar 08 21:28:39 crc kubenswrapper[4885]: I0308 21:28:39.393684 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" path="/var/lib/kubelet/pods/c690d770-1f1e-4e17-991c-4a7696a26cea/volumes"
Mar 08 21:28:46 crc kubenswrapper[4885]: I0308 21:28:46.368152 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"
Mar 08 21:28:46 crc kubenswrapper[4885]: E0308 21:28:46.368875 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:28:58 crc kubenswrapper[4885]: I0308 21:28:58.369306 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"
Mar 08 21:28:58 crc kubenswrapper[4885]: E0308 21:28:58.370314 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:29:11 crc kubenswrapper[4885]: I0308 21:29:11.179683 4885 generic.go:334] "Generic (PLEG): container finished" podID="8be575f8-a741-4b5a-b7fa-c43e5dd65598" containerID="9a54c0c84a047f8db9c0abfba2cd8a399a25f81595ecb87af623107a04129487" exitCode=0
Mar 08 21:29:11 crc kubenswrapper[4885]: I0308 21:29:11.179842 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" event={"ID":"8be575f8-a741-4b5a-b7fa-c43e5dd65598","Type":"ContainerDied","Data":"9a54c0c84a047f8db9c0abfba2cd8a399a25f81595ecb87af623107a04129487"}
Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.856649 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"
Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.978495 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph\") pod \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") "
Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.978926 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle\") pod \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") "
Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.979092 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc9lt\" (UniqueName: \"kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt\") pod \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") "
Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.979656 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory\") pod \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") "
Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.979721 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1\") pod \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") "
Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.985098 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "8be575f8-a741-4b5a-b7fa-c43e5dd65598" (UID: "8be575f8-a741-4b5a-b7fa-c43e5dd65598"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.987068 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt" (OuterVolumeSpecName: "kube-api-access-pc9lt") pod "8be575f8-a741-4b5a-b7fa-c43e5dd65598" (UID: "8be575f8-a741-4b5a-b7fa-c43e5dd65598"). InnerVolumeSpecName "kube-api-access-pc9lt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.989280 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph" (OuterVolumeSpecName: "ceph") pod "8be575f8-a741-4b5a-b7fa-c43e5dd65598" (UID: "8be575f8-a741-4b5a-b7fa-c43e5dd65598"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.021129 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8be575f8-a741-4b5a-b7fa-c43e5dd65598" (UID: "8be575f8-a741-4b5a-b7fa-c43e5dd65598"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.033554 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory" (OuterVolumeSpecName: "inventory") pod "8be575f8-a741-4b5a-b7fa-c43e5dd65598" (UID: "8be575f8-a741-4b5a-b7fa-c43e5dd65598"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.082099 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc9lt\" (UniqueName: \"kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt\") on node \"crc\" DevicePath \"\""
Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.082137 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory\") on node \"crc\" DevicePath \"\""
Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.082147 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.082159 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph\") on node \"crc\" DevicePath \"\""
Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.082174 4885 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.206044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" event={"ID":"8be575f8-a741-4b5a-b7fa-c43e5dd65598","Type":"ContainerDied","Data":"fcaf57e77e3aeefacc328413ba9ef9243f80683c97c18bb5d80337ab11bd45a0"}
Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.206102 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcaf57e77e3aeefacc328413ba9ef9243f80683c97c18bb5d80337ab11bd45a0"
Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.206173 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"
Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.369087 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"
Mar 08 21:29:13 crc kubenswrapper[4885]: E0308 21:29:13.369415 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.367320 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bcvmz"]
Mar 08 21:29:15 crc kubenswrapper[4885]: E0308 21:29:15.368377 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="registry-server"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368394 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="registry-server"
Mar 08 21:29:15 crc kubenswrapper[4885]: E0308 21:29:15.368416 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be575f8-a741-4b5a-b7fa-c43e5dd65598" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368426 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be575f8-a741-4b5a-b7fa-c43e5dd65598" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Mar 08 21:29:15 crc kubenswrapper[4885]: E0308 21:29:15.368483 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="extract-content"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368491 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="extract-content"
Mar 08 21:29:15 crc kubenswrapper[4885]: E0308 21:29:15.368514 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="extract-utilities"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368523 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="extract-utilities"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368754 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="registry-server"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368777 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be575f8-a741-4b5a-b7fa-c43e5dd65598" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.381601 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bcvmz"]
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.381757 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.388417 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.388598 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.389440 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.389583 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.541181 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.541427 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.541607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.541635 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.541817 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh6cn\" (UniqueName: \"kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.643377 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.643428 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.643544 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh6cn\" (UniqueName: \"kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.643646 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.643677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.650520 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.650539 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz"
Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.650822 4885 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.651976 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.666006 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh6cn\" (UniqueName: \"kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.730856 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:16 crc kubenswrapper[4885]: I0308 21:29:16.296470 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bcvmz"] Mar 08 21:29:17 crc kubenswrapper[4885]: I0308 21:29:17.253006 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" event={"ID":"51b71742-3986-42a4-a016-eeecb3a7ba16","Type":"ContainerStarted","Data":"406060aa77426fd158b89e1d5655fcca0b66ca3902f98a8c32bb3f98e32a6fac"} Mar 08 21:29:17 crc kubenswrapper[4885]: I0308 21:29:17.253715 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" event={"ID":"51b71742-3986-42a4-a016-eeecb3a7ba16","Type":"ContainerStarted","Data":"402a266dd7afc4a5f19cb1569e07873933799091a2ea37f980ac79de9b2acced"} Mar 08 21:29:17 crc kubenswrapper[4885]: I0308 21:29:17.276864 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" podStartSLOduration=1.714823956 podStartE2EDuration="2.276839959s" podCreationTimestamp="2026-03-08 21:29:15 +0000 UTC" firstStartedPulling="2026-03-08 21:29:16.305297864 +0000 UTC m=+7057.701351887" lastFinishedPulling="2026-03-08 21:29:16.867313837 +0000 UTC m=+7058.263367890" observedRunningTime="2026-03-08 21:29:17.273556532 +0000 UTC m=+7058.669610595" watchObservedRunningTime="2026-03-08 21:29:17.276839959 +0000 UTC m=+7058.672893982" Mar 08 21:29:25 crc kubenswrapper[4885]: I0308 21:29:25.369020 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:29:25 crc kubenswrapper[4885]: E0308 21:29:25.369849 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:29:38 crc kubenswrapper[4885]: I0308 21:29:38.367991 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:29:38 crc kubenswrapper[4885]: E0308 21:29:38.369667 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:29:52 crc kubenswrapper[4885]: I0308 21:29:52.368084 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:29:52 crc kubenswrapper[4885]: E0308 21:29:52.368906 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.167780 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550090-hww59"] Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.169665 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.173194 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.173439 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.173691 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.186956 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb"] Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.188709 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.190408 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.190654 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.199121 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550090-hww59"] Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.205602 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb"] Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.236443 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74tb7\" (UniqueName: 
\"kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7\") pod \"auto-csr-approver-29550090-hww59\" (UID: \"ab193869-7f8c-4475-8be4-393848bd54e3\") " pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.338302 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.338359 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6s5t\" (UniqueName: \"kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.338684 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74tb7\" (UniqueName: \"kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7\") pod \"auto-csr-approver-29550090-hww59\" (UID: \"ab193869-7f8c-4475-8be4-393848bd54e3\") " pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.338935 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 
crc kubenswrapper[4885]: I0308 21:30:00.359565 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74tb7\" (UniqueName: \"kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7\") pod \"auto-csr-approver-29550090-hww59\" (UID: \"ab193869-7f8c-4475-8be4-393848bd54e3\") " pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.441335 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6s5t\" (UniqueName: \"kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.441795 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.441879 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.442613 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: 
\"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.446134 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.468352 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6s5t\" (UniqueName: \"kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.518382 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.526240 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.042135 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb"] Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.115760 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550090-hww59"] Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.765674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550090-hww59" event={"ID":"ab193869-7f8c-4475-8be4-393848bd54e3","Type":"ContainerStarted","Data":"bcc086b5f410e69108c6354eee341d85d4749c8318a849c927136f2ce32247ec"} Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.768064 4885 generic.go:334] "Generic (PLEG): container finished" podID="c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" containerID="8ffeb3ea1d44ddbc8ed5f91dcd1d3740e5d0c398b63612136a09bb9296a735fb" exitCode=0 Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.768174 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" event={"ID":"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba","Type":"ContainerDied","Data":"8ffeb3ea1d44ddbc8ed5f91dcd1d3740e5d0c398b63612136a09bb9296a735fb"} Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.768226 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" event={"ID":"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba","Type":"ContainerStarted","Data":"d0205129c28fbc2f325be58a840e914ab363a820f495c414162e025d04e83d97"} Mar 08 21:30:02 crc kubenswrapper[4885]: I0308 21:30:02.781035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550090-hww59" 
event={"ID":"ab193869-7f8c-4475-8be4-393848bd54e3","Type":"ContainerStarted","Data":"7eee2ffb4dea4a4b434fae8ad567627bf150b9abb9d76f55cab57ee721350700"} Mar 08 21:30:02 crc kubenswrapper[4885]: I0308 21:30:02.825906 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550090-hww59" podStartSLOduration=1.590647484 podStartE2EDuration="2.82588407s" podCreationTimestamp="2026-03-08 21:30:00 +0000 UTC" firstStartedPulling="2026-03-08 21:30:01.127701482 +0000 UTC m=+7102.523755515" lastFinishedPulling="2026-03-08 21:30:02.362938078 +0000 UTC m=+7103.758992101" observedRunningTime="2026-03-08 21:30:02.809059444 +0000 UTC m=+7104.205113467" watchObservedRunningTime="2026-03-08 21:30:02.82588407 +0000 UTC m=+7104.221938093" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.353989 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.416990 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume\") pod \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.417072 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume\") pod \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.417126 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6s5t\" (UniqueName: \"kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t\") pod 
\"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.417882 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume" (OuterVolumeSpecName: "config-volume") pod "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" (UID: "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.423982 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" (UID: "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.424116 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t" (OuterVolumeSpecName: "kube-api-access-b6s5t") pod "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" (UID: "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba"). InnerVolumeSpecName "kube-api-access-b6s5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.520450 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.520491 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.520511 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6s5t\" (UniqueName: \"kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t\") on node \"crc\" DevicePath \"\"" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.795588 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" event={"ID":"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba","Type":"ContainerDied","Data":"d0205129c28fbc2f325be58a840e914ab363a820f495c414162e025d04e83d97"} Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.796106 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0205129c28fbc2f325be58a840e914ab363a820f495c414162e025d04e83d97" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.795612 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.798178 4885 generic.go:334] "Generic (PLEG): container finished" podID="ab193869-7f8c-4475-8be4-393848bd54e3" containerID="7eee2ffb4dea4a4b434fae8ad567627bf150b9abb9d76f55cab57ee721350700" exitCode=0 Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.798232 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550090-hww59" event={"ID":"ab193869-7f8c-4475-8be4-393848bd54e3","Type":"ContainerDied","Data":"7eee2ffb4dea4a4b434fae8ad567627bf150b9abb9d76f55cab57ee721350700"} Mar 08 21:30:04 crc kubenswrapper[4885]: I0308 21:30:04.462019 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"] Mar 08 21:30:04 crc kubenswrapper[4885]: I0308 21:30:04.476197 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"] Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.262821 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.364685 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74tb7\" (UniqueName: \"kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7\") pod \"ab193869-7f8c-4475-8be4-393848bd54e3\" (UID: \"ab193869-7f8c-4475-8be4-393848bd54e3\") " Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.369659 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7" (OuterVolumeSpecName: "kube-api-access-74tb7") pod "ab193869-7f8c-4475-8be4-393848bd54e3" (UID: "ab193869-7f8c-4475-8be4-393848bd54e3"). 
InnerVolumeSpecName "kube-api-access-74tb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.370090 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.387070 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414af8b3-3809-477a-a110-9acaf82a7a3b" path="/var/lib/kubelet/pods/414af8b3-3809-477a-a110-9acaf82a7a3b/volumes" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.467624 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74tb7\" (UniqueName: \"kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7\") on node \"crc\" DevicePath \"\"" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.820207 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672"} Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.822455 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550090-hww59" event={"ID":"ab193869-7f8c-4475-8be4-393848bd54e3","Type":"ContainerDied","Data":"bcc086b5f410e69108c6354eee341d85d4749c8318a849c927136f2ce32247ec"} Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.822479 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcc086b5f410e69108c6354eee341d85d4749c8318a849c927136f2ce32247ec" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.822504 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.893615 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550084-9r2m6"] Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.907558 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550084-9r2m6"] Mar 08 21:30:07 crc kubenswrapper[4885]: I0308 21:30:07.412722 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d2e43f9-d4b1-4059-b714-26745e0d96ce" path="/var/lib/kubelet/pods/3d2e43f9-d4b1-4059-b714-26745e0d96ce/volumes" Mar 08 21:30:11 crc kubenswrapper[4885]: I0308 21:30:11.812202 4885 scope.go:117] "RemoveContainer" containerID="62ad3a335e07200b3e1dfc3daa3934ac465add0682b6dca882716bb449686e0a" Mar 08 21:30:11 crc kubenswrapper[4885]: I0308 21:30:11.862720 4885 scope.go:117] "RemoveContainer" containerID="b81c31bbcbb29c0b9da44d4b1b46d19caf5f3201ab351fd09c58a29971e19359" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.581811 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"] Mar 08 21:30:38 crc kubenswrapper[4885]: E0308 21:30:38.586447 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab193869-7f8c-4475-8be4-393848bd54e3" containerName="oc" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.586470 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab193869-7f8c-4475-8be4-393848bd54e3" containerName="oc" Mar 08 21:30:38 crc kubenswrapper[4885]: E0308 21:30:38.586512 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" containerName="collect-profiles" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.586524 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" containerName="collect-profiles" Mar 08 21:30:38 crc 
kubenswrapper[4885]: I0308 21:30:38.586800 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" containerName="collect-profiles" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.586826 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab193869-7f8c-4475-8be4-393848bd54e3" containerName="oc" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.588888 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.594287 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"] Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.769486 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ndmm\" (UniqueName: \"kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.769752 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.769787 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc 
kubenswrapper[4885]: I0308 21:30:38.872320 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ndmm\" (UniqueName: \"kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.872379 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.872424 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.872938 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.872968 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.904755 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ndmm\" (UniqueName: \"kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2"
Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.933290 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsqw2"
Mar 08 21:30:39 crc kubenswrapper[4885]: I0308 21:30:39.429396 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"]
Mar 08 21:30:40 crc kubenswrapper[4885]: I0308 21:30:40.276015 4885 generic.go:334] "Generic (PLEG): container finished" podID="746e5174-2bf4-4698-b846-9bf402677b6f" containerID="ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d" exitCode=0
Mar 08 21:30:40 crc kubenswrapper[4885]: I0308 21:30:40.276078 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerDied","Data":"ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d"}
Mar 08 21:30:40 crc kubenswrapper[4885]: I0308 21:30:40.276424 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerStarted","Data":"a7e029066f7d884b7eac13ca0e4540905bf51badb3b42685af6bd3d5e88e7bf2"}
Mar 08 21:30:41 crc kubenswrapper[4885]: I0308 21:30:41.286338 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerStarted","Data":"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c"}
Mar 08 21:30:42 crc kubenswrapper[4885]: I0308 21:30:42.300311 4885
generic.go:334] "Generic (PLEG): container finished" podID="746e5174-2bf4-4698-b846-9bf402677b6f" containerID="f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c" exitCode=0
Mar 08 21:30:42 crc kubenswrapper[4885]: I0308 21:30:42.300795 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerDied","Data":"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c"}
Mar 08 21:30:43 crc kubenswrapper[4885]: I0308 21:30:43.315387 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerStarted","Data":"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf"}
Mar 08 21:30:43 crc kubenswrapper[4885]: I0308 21:30:43.336312 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gsqw2" podStartSLOduration=2.89831784 podStartE2EDuration="5.336287651s" podCreationTimestamp="2026-03-08 21:30:38 +0000 UTC" firstStartedPulling="2026-03-08 21:30:40.281129393 +0000 UTC m=+7141.677183456" lastFinishedPulling="2026-03-08 21:30:42.719099244 +0000 UTC m=+7144.115153267" observedRunningTime="2026-03-08 21:30:43.33288698 +0000 UTC m=+7144.728941043" watchObservedRunningTime="2026-03-08 21:30:43.336287651 +0000 UTC m=+7144.732341684"
Mar 08 21:30:48 crc kubenswrapper[4885]: I0308 21:30:48.933528 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gsqw2"
Mar 08 21:30:48 crc kubenswrapper[4885]: I0308 21:30:48.934158 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gsqw2"
Mar 08 21:30:48 crc kubenswrapper[4885]: I0308 21:30:48.979548 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-marketplace/redhat-marketplace-gsqw2"
Mar 08 21:30:49 crc kubenswrapper[4885]: I0308 21:30:49.486408 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gsqw2"
Mar 08 21:30:49 crc kubenswrapper[4885]: I0308 21:30:49.559749 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"]
Mar 08 21:30:51 crc kubenswrapper[4885]: I0308 21:30:51.410267 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gsqw2" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="registry-server" containerID="cri-o://cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf" gracePeriod=2
Mar 08 21:30:51 crc kubenswrapper[4885]: I0308 21:30:51.929800 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsqw2"
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.009155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ndmm\" (UniqueName: \"kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm\") pod \"746e5174-2bf4-4698-b846-9bf402677b6f\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") "
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.009198 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content\") pod \"746e5174-2bf4-4698-b846-9bf402677b6f\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") "
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.009269 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities\") pod
\"746e5174-2bf4-4698-b846-9bf402677b6f\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") "
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.010777 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities" (OuterVolumeSpecName: "utilities") pod "746e5174-2bf4-4698-b846-9bf402677b6f" (UID: "746e5174-2bf4-4698-b846-9bf402677b6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.018476 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm" (OuterVolumeSpecName: "kube-api-access-6ndmm") pod "746e5174-2bf4-4698-b846-9bf402677b6f" (UID: "746e5174-2bf4-4698-b846-9bf402677b6f"). InnerVolumeSpecName "kube-api-access-6ndmm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.045364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "746e5174-2bf4-4698-b846-9bf402677b6f" (UID: "746e5174-2bf4-4698-b846-9bf402677b6f"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.111237 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ndmm\" (UniqueName: \"kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm\") on node \"crc\" DevicePath \"\""
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.111465 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.111546 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.419813 4885 generic.go:334] "Generic (PLEG): container finished" podID="746e5174-2bf4-4698-b846-9bf402677b6f" containerID="cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf" exitCode=0
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.419870 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerDied","Data":"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf"}
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.420119 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerDied","Data":"a7e029066f7d884b7eac13ca0e4540905bf51badb3b42685af6bd3d5e88e7bf2"}
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.419908 4885 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsqw2"
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.420139 4885 scope.go:117] "RemoveContainer" containerID="cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf"
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.447665 4885 scope.go:117] "RemoveContainer" containerID="f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c"
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.480082 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"]
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.490716 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"]
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.496996 4885 scope.go:117] "RemoveContainer" containerID="ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d"
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.551826 4885 scope.go:117] "RemoveContainer" containerID="cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf"
Mar 08 21:30:52 crc kubenswrapper[4885]: E0308 21:30:52.552341 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf\": container with ID starting with cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf not found: ID does not exist" containerID="cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf"
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.552445 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf"} err="failed to get container status \"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf\": rpc error: code = NotFound desc = could not find container
\"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf\": container with ID starting with cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf not found: ID does not exist"
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.552478 4885 scope.go:117] "RemoveContainer" containerID="f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c"
Mar 08 21:30:52 crc kubenswrapper[4885]: E0308 21:30:52.553040 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c\": container with ID starting with f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c not found: ID does not exist" containerID="f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c"
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.553080 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c"} err="failed to get container status \"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c\": rpc error: code = NotFound desc = could not find container \"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c\": container with ID starting with f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c not found: ID does not exist"
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.553107 4885 scope.go:117] "RemoveContainer" containerID="ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d"
Mar 08 21:30:52 crc kubenswrapper[4885]: E0308 21:30:52.553412 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d\": container with ID starting with ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d not found: ID does not exist"
containerID="ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d"
Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.553440 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d"} err="failed to get container status \"ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d\": rpc error: code = NotFound desc = could not find container \"ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d\": container with ID starting with ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d not found: ID does not exist"
Mar 08 21:30:53 crc kubenswrapper[4885]: I0308 21:30:53.383127 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" path="/var/lib/kubelet/pods/746e5174-2bf4-4698-b846-9bf402677b6f/volumes"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.210360 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"]
Mar 08 21:31:09 crc kubenswrapper[4885]: E0308 21:31:09.213139 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="extract-utilities"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.213262 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="extract-utilities"
Mar 08 21:31:09 crc kubenswrapper[4885]: E0308 21:31:09.213381 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="registry-server"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.213459 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="registry-server"
Mar 08 21:31:09 crc kubenswrapper[4885]: E0308 21:31:09.213548 4885 cpu_manager.go:410] "RemoveStaleState:
removing container" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="extract-content"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.213627 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="extract-content"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.214013 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="registry-server"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.216275 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.252021 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"]
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.339778 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.339852 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.340050 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs\") pod
\"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.442258 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.442628 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.442907 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.443160 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.445524 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content\") pod \"certified-operators-xwfb7\" (UID:
\"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.463685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.595282 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:10 crc kubenswrapper[4885]: I0308 21:31:10.099544 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"]
Mar 08 21:31:10 crc kubenswrapper[4885]: I0308 21:31:10.630325 4885 generic.go:334] "Generic (PLEG): container finished" podID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerID="ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1" exitCode=0
Mar 08 21:31:10 crc kubenswrapper[4885]: I0308 21:31:10.630406 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerDied","Data":"ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1"}
Mar 08 21:31:10 crc kubenswrapper[4885]: I0308 21:31:10.630748 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerStarted","Data":"6d3789bc5062d4a5f67bb288294750b1cd9355abc2ea912909328e88879e17f3"}
Mar 08 21:31:10 crc kubenswrapper[4885]: I0308 21:31:10.634977 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 21:31:11 crc kubenswrapper[4885]: I0308 21:31:11.643477 4885
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerStarted","Data":"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a"}
Mar 08 21:31:13 crc kubenswrapper[4885]: I0308 21:31:13.664525 4885 generic.go:334] "Generic (PLEG): container finished" podID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerID="5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a" exitCode=0
Mar 08 21:31:13 crc kubenswrapper[4885]: I0308 21:31:13.664620 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerDied","Data":"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a"}
Mar 08 21:31:14 crc kubenswrapper[4885]: I0308 21:31:14.687366 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerStarted","Data":"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298"}
Mar 08 21:31:14 crc kubenswrapper[4885]: I0308 21:31:14.718006 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xwfb7" podStartSLOduration=2.16202467 podStartE2EDuration="5.717986405s" podCreationTimestamp="2026-03-08 21:31:09 +0000 UTC" firstStartedPulling="2026-03-08 21:31:10.634410921 +0000 UTC m=+7172.030464984" lastFinishedPulling="2026-03-08 21:31:14.190372666 +0000 UTC m=+7175.586426719" observedRunningTime="2026-03-08 21:31:14.70952472 +0000 UTC m=+7176.105578773" watchObservedRunningTime="2026-03-08 21:31:14.717986405 +0000 UTC m=+7176.114040438"
Mar 08 21:31:19 crc kubenswrapper[4885]: I0308 21:31:19.595847 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:19
crc kubenswrapper[4885]: I0308 21:31:19.596483 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:19 crc kubenswrapper[4885]: I0308 21:31:19.667502 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:19 crc kubenswrapper[4885]: I0308 21:31:19.819544 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:19 crc kubenswrapper[4885]: I0308 21:31:19.911514 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"]
Mar 08 21:31:21 crc kubenswrapper[4885]: I0308 21:31:21.763837 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xwfb7" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="registry-server" containerID="cri-o://b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298" gracePeriod=2
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.324309 4885 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.360036 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs\") pod \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") "
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.360175 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities\") pod \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") "
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.360210 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content\") pod \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") "
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.366230 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities" (OuterVolumeSpecName: "utilities") pod "f11798c6-04c3-4e9d-a01b-998f5e3c1e93" (UID: "f11798c6-04c3-4e9d-a01b-998f5e3c1e93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.373409 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs" (OuterVolumeSpecName: "kube-api-access-lvbzs") pod "f11798c6-04c3-4e9d-a01b-998f5e3c1e93" (UID: "f11798c6-04c3-4e9d-a01b-998f5e3c1e93"). InnerVolumeSpecName "kube-api-access-lvbzs".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.420082 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f11798c6-04c3-4e9d-a01b-998f5e3c1e93" (UID: "f11798c6-04c3-4e9d-a01b-998f5e3c1e93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.462675 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs\") on node \"crc\" DevicePath \"\""
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.462712 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.462797 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.780556 4885 generic.go:334] "Generic (PLEG): container finished" podID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerID="b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298" exitCode=0
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.780913 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerDied","Data":"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298"}
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.780990 4885 kubelet.go:2453] "SyncLoop (PLEG): event
for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerDied","Data":"6d3789bc5062d4a5f67bb288294750b1cd9355abc2ea912909328e88879e17f3"}
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.781019 4885 scope.go:117] "RemoveContainer" containerID="b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298"
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.781221 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwfb7"
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.805176 4885 scope.go:117] "RemoveContainer" containerID="5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a"
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.854304 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"]
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.862510 4885 scope.go:117] "RemoveContainer" containerID="ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1"
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.885505 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"]
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.909597 4885 scope.go:117] "RemoveContainer" containerID="b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298"
Mar 08 21:31:22 crc kubenswrapper[4885]: E0308 21:31:22.910004 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298\": container with ID starting with b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298 not found: ID does not exist" containerID="b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298"
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308
21:31:22.910043 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298"} err="failed to get container status \"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298\": rpc error: code = NotFound desc = could not find container \"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298\": container with ID starting with b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298 not found: ID does not exist"
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.910065 4885 scope.go:117] "RemoveContainer" containerID="5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a"
Mar 08 21:31:22 crc kubenswrapper[4885]: E0308 21:31:22.910363 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a\": container with ID starting with 5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a not found: ID does not exist" containerID="5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a"
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.910413 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a"} err="failed to get container status \"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a\": rpc error: code = NotFound desc = could not find container \"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a\": container with ID starting with 5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a not found: ID does not exist"
Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.910444 4885 scope.go:117] "RemoveContainer" containerID="ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1"
Mar 08 21:31:22 crc
kubenswrapper[4885]: E0308 21:31:22.910726 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1\": container with ID starting with ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1 not found: ID does not exist" containerID="ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.910752 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1"} err="failed to get container status \"ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1\": rpc error: code = NotFound desc = could not find container \"ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1\": container with ID starting with ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1 not found: ID does not exist" Mar 08 21:31:23 crc kubenswrapper[4885]: I0308 21:31:23.385865 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" path="/var/lib/kubelet/pods/f11798c6-04c3-4e9d-a01b-998f5e3c1e93/volumes" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.163132 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550092-kjpvt"] Mar 08 21:32:00 crc kubenswrapper[4885]: E0308 21:32:00.163883 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="extract-utilities" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.163894 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="extract-utilities" Mar 08 21:32:00 crc kubenswrapper[4885]: E0308 21:32:00.163938 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="extract-content" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.163944 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="extract-content" Mar 08 21:32:00 crc kubenswrapper[4885]: E0308 21:32:00.163958 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="registry-server" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.163963 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="registry-server" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.164151 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="registry-server" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.164812 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.168087 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.168254 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.168509 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.183239 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550092-kjpvt"] Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.253490 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24b7n\" (UniqueName: 
\"kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n\") pod \"auto-csr-approver-29550092-kjpvt\" (UID: \"244b80f2-9a2b-4db4-a451-086baed68f2a\") " pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.355944 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24b7n\" (UniqueName: \"kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n\") pod \"auto-csr-approver-29550092-kjpvt\" (UID: \"244b80f2-9a2b-4db4-a451-086baed68f2a\") " pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.373354 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24b7n\" (UniqueName: \"kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n\") pod \"auto-csr-approver-29550092-kjpvt\" (UID: \"244b80f2-9a2b-4db4-a451-086baed68f2a\") " pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.483855 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:01 crc kubenswrapper[4885]: I0308 21:32:01.013872 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550092-kjpvt"] Mar 08 21:32:01 crc kubenswrapper[4885]: I0308 21:32:01.316700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" event={"ID":"244b80f2-9a2b-4db4-a451-086baed68f2a","Type":"ContainerStarted","Data":"1dfa212bca25c37e1b5db777ec7ce620194a8ed2afe9236e6a4d1b3837b2e6a4"} Mar 08 21:32:02 crc kubenswrapper[4885]: I0308 21:32:02.328026 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" event={"ID":"244b80f2-9a2b-4db4-a451-086baed68f2a","Type":"ContainerStarted","Data":"3d584615f7c68fc963e45c7150d91d00d64b9ed657e10cd4826322e66a7ec964"} Mar 08 21:32:02 crc kubenswrapper[4885]: I0308 21:32:02.358895 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" podStartSLOduration=1.427598881 podStartE2EDuration="2.358875797s" podCreationTimestamp="2026-03-08 21:32:00 +0000 UTC" firstStartedPulling="2026-03-08 21:32:01.02719945 +0000 UTC m=+7222.423253513" lastFinishedPulling="2026-03-08 21:32:01.958476366 +0000 UTC m=+7223.354530429" observedRunningTime="2026-03-08 21:32:02.347259698 +0000 UTC m=+7223.743313731" watchObservedRunningTime="2026-03-08 21:32:02.358875797 +0000 UTC m=+7223.754929830" Mar 08 21:32:03 crc kubenswrapper[4885]: I0308 21:32:03.351315 4885 generic.go:334] "Generic (PLEG): container finished" podID="244b80f2-9a2b-4db4-a451-086baed68f2a" containerID="3d584615f7c68fc963e45c7150d91d00d64b9ed657e10cd4826322e66a7ec964" exitCode=0 Mar 08 21:32:03 crc kubenswrapper[4885]: I0308 21:32:03.351454 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" 
event={"ID":"244b80f2-9a2b-4db4-a451-086baed68f2a","Type":"ContainerDied","Data":"3d584615f7c68fc963e45c7150d91d00d64b9ed657e10cd4826322e66a7ec964"} Mar 08 21:32:04 crc kubenswrapper[4885]: I0308 21:32:04.776373 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:04 crc kubenswrapper[4885]: I0308 21:32:04.868746 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24b7n\" (UniqueName: \"kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n\") pod \"244b80f2-9a2b-4db4-a451-086baed68f2a\" (UID: \"244b80f2-9a2b-4db4-a451-086baed68f2a\") " Mar 08 21:32:04 crc kubenswrapper[4885]: I0308 21:32:04.880192 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n" (OuterVolumeSpecName: "kube-api-access-24b7n") pod "244b80f2-9a2b-4db4-a451-086baed68f2a" (UID: "244b80f2-9a2b-4db4-a451-086baed68f2a"). InnerVolumeSpecName "kube-api-access-24b7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:32:04 crc kubenswrapper[4885]: I0308 21:32:04.971519 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24b7n\" (UniqueName: \"kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n\") on node \"crc\" DevicePath \"\"" Mar 08 21:32:05 crc kubenswrapper[4885]: I0308 21:32:05.390408 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" event={"ID":"244b80f2-9a2b-4db4-a451-086baed68f2a","Type":"ContainerDied","Data":"1dfa212bca25c37e1b5db777ec7ce620194a8ed2afe9236e6a4d1b3837b2e6a4"} Mar 08 21:32:05 crc kubenswrapper[4885]: I0308 21:32:05.390748 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dfa212bca25c37e1b5db777ec7ce620194a8ed2afe9236e6a4d1b3837b2e6a4" Mar 08 21:32:05 crc kubenswrapper[4885]: I0308 21:32:05.390844 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:05 crc kubenswrapper[4885]: I0308 21:32:05.436765 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550086-vhcf6"] Mar 08 21:32:05 crc kubenswrapper[4885]: I0308 21:32:05.449801 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550086-vhcf6"] Mar 08 21:32:07 crc kubenswrapper[4885]: I0308 21:32:07.400284 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e490c3-347d-4c4b-aa26-3f680e0bebc0" path="/var/lib/kubelet/pods/e5e490c3-347d-4c4b-aa26-3f680e0bebc0/volumes" Mar 08 21:32:12 crc kubenswrapper[4885]: I0308 21:32:12.052505 4885 scope.go:117] "RemoveContainer" containerID="a2343f387ee044683e4b5c10c184262a0a1ddcb6bceb9b6768e6cef7d9c4c637" Mar 08 21:32:30 crc kubenswrapper[4885]: I0308 21:32:30.705631 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="51b71742-3986-42a4-a016-eeecb3a7ba16" containerID="406060aa77426fd158b89e1d5655fcca0b66ca3902f98a8c32bb3f98e32a6fac" exitCode=0 Mar 08 21:32:30 crc kubenswrapper[4885]: I0308 21:32:30.705700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" event={"ID":"51b71742-3986-42a4-a016-eeecb3a7ba16","Type":"ContainerDied","Data":"406060aa77426fd158b89e1d5655fcca0b66ca3902f98a8c32bb3f98e32a6fac"} Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.259655 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.389716 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle\") pod \"51b71742-3986-42a4-a016-eeecb3a7ba16\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.389769 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph\") pod \"51b71742-3986-42a4-a016-eeecb3a7ba16\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.389819 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh6cn\" (UniqueName: \"kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn\") pod \"51b71742-3986-42a4-a016-eeecb3a7ba16\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.389984 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory\") pod 
\"51b71742-3986-42a4-a016-eeecb3a7ba16\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.390026 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1\") pod \"51b71742-3986-42a4-a016-eeecb3a7ba16\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.398206 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph" (OuterVolumeSpecName: "ceph") pod "51b71742-3986-42a4-a016-eeecb3a7ba16" (UID: "51b71742-3986-42a4-a016-eeecb3a7ba16"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.399182 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn" (OuterVolumeSpecName: "kube-api-access-gh6cn") pod "51b71742-3986-42a4-a016-eeecb3a7ba16" (UID: "51b71742-3986-42a4-a016-eeecb3a7ba16"). InnerVolumeSpecName "kube-api-access-gh6cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.399347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "51b71742-3986-42a4-a016-eeecb3a7ba16" (UID: "51b71742-3986-42a4-a016-eeecb3a7ba16"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.421954 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory" (OuterVolumeSpecName: "inventory") pod "51b71742-3986-42a4-a016-eeecb3a7ba16" (UID: "51b71742-3986-42a4-a016-eeecb3a7ba16"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.444093 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "51b71742-3986-42a4-a016-eeecb3a7ba16" (UID: "51b71742-3986-42a4-a016-eeecb3a7ba16"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.492780 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.492835 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh6cn\" (UniqueName: \"kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn\") on node \"crc\" DevicePath \"\"" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.492854 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.492872 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1\") on node \"crc\" DevicePath 
\"\"" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.492889 4885 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.736470 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" event={"ID":"51b71742-3986-42a4-a016-eeecb3a7ba16","Type":"ContainerDied","Data":"402a266dd7afc4a5f19cb1569e07873933799091a2ea37f980ac79de9b2acced"} Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.736810 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="402a266dd7afc4a5f19cb1569e07873933799091a2ea37f980ac79de9b2acced" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.736566 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.818109 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.818205 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.865198 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wl2gp"] Mar 08 21:32:32 crc 
kubenswrapper[4885]: E0308 21:32:32.866010 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244b80f2-9a2b-4db4-a451-086baed68f2a" containerName="oc" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.866040 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="244b80f2-9a2b-4db4-a451-086baed68f2a" containerName="oc" Mar 08 21:32:32 crc kubenswrapper[4885]: E0308 21:32:32.866087 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51b71742-3986-42a4-a016-eeecb3a7ba16" containerName="bootstrap-openstack-openstack-cell1" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.866104 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b71742-3986-42a4-a016-eeecb3a7ba16" containerName="bootstrap-openstack-openstack-cell1" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.866550 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="244b80f2-9a2b-4db4-a451-086baed68f2a" containerName="oc" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.866586 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="51b71742-3986-42a4-a016-eeecb3a7ba16" containerName="bootstrap-openstack-openstack-cell1" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.867887 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.871399 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.871804 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.871868 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.872604 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wl2gp"] Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.872770 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.011823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmwsg\" (UniqueName: \"kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.012213 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.012391 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.012523 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.114674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmwsg\" (UniqueName: \"kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.114795 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.114860 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " 
pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.114905 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.121187 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.121344 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.126405 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.141101 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmwsg\" (UniqueName: \"kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg\") pod 
\"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.198068 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.869540 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wl2gp"] Mar 08 21:32:34 crc kubenswrapper[4885]: I0308 21:32:34.761428 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" event={"ID":"d2786842-7b37-4e0c-843e-9dc4467df6ad","Type":"ContainerStarted","Data":"f102d4cb9581980afc0e918a1eb63d77a02bb60102eaf5747b80166ea030a1cd"} Mar 08 21:32:34 crc kubenswrapper[4885]: I0308 21:32:34.762031 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" event={"ID":"d2786842-7b37-4e0c-843e-9dc4467df6ad","Type":"ContainerStarted","Data":"8cd15d3c2a11100671ae9112dd774068c60e76586455679373e108e318186d7e"} Mar 08 21:32:34 crc kubenswrapper[4885]: I0308 21:32:34.783896 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" podStartSLOduration=2.250114322 podStartE2EDuration="2.783874815s" podCreationTimestamp="2026-03-08 21:32:32 +0000 UTC" firstStartedPulling="2026-03-08 21:32:33.879991096 +0000 UTC m=+7255.276045129" lastFinishedPulling="2026-03-08 21:32:34.413751559 +0000 UTC m=+7255.809805622" observedRunningTime="2026-03-08 21:32:34.780772563 +0000 UTC m=+7256.176826596" watchObservedRunningTime="2026-03-08 21:32:34.783874815 +0000 UTC m=+7256.179928848" Mar 08 21:33:02 crc kubenswrapper[4885]: I0308 21:33:02.817944 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:33:02 crc kubenswrapper[4885]: I0308 21:33:02.818512 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:33:32 crc kubenswrapper[4885]: I0308 21:33:32.818838 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:33:32 crc kubenswrapper[4885]: I0308 21:33:32.819687 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:33:32 crc kubenswrapper[4885]: I0308 21:33:32.819860 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:33:32 crc kubenswrapper[4885]: I0308 21:33:32.825310 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted"
Mar 08 21:33:32 crc kubenswrapper[4885]: I0308 21:33:32.825438 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672" gracePeriod=600
Mar 08 21:33:33 crc kubenswrapper[4885]: I0308 21:33:33.479772 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672" exitCode=0
Mar 08 21:33:33 crc kubenswrapper[4885]: I0308 21:33:33.479854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672"}
Mar 08 21:33:33 crc kubenswrapper[4885]: I0308 21:33:33.480241 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9"}
Mar 08 21:33:33 crc kubenswrapper[4885]: I0308 21:33:33.480266 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"
Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.147574 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550094-mjnvj"]
Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.150671 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550094-mjnvj"
Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.154714 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.154831 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.154841 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.159963 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550094-mjnvj"]
Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.241228 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzp7n\" (UniqueName: \"kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n\") pod \"auto-csr-approver-29550094-mjnvj\" (UID: \"c2a860a0-84ad-49b7-8596-05521c33108a\") " pod="openshift-infra/auto-csr-approver-29550094-mjnvj"
Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.343811 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzp7n\" (UniqueName: \"kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n\") pod \"auto-csr-approver-29550094-mjnvj\" (UID: \"c2a860a0-84ad-49b7-8596-05521c33108a\") " pod="openshift-infra/auto-csr-approver-29550094-mjnvj"
Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.374240 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzp7n\" (UniqueName: \"kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n\") pod \"auto-csr-approver-29550094-mjnvj\" (UID: \"c2a860a0-84ad-49b7-8596-05521c33108a\") " pod="openshift-infra/auto-csr-approver-29550094-mjnvj"
Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.484569 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550094-mjnvj"
Mar 08 21:34:00 crc kubenswrapper[4885]: W0308 21:34:00.965730 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2a860a0_84ad_49b7_8596_05521c33108a.slice/crio-5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a WatchSource:0}: Error finding container 5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a: Status 404 returned error can't find the container with id 5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a
Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.967143 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550094-mjnvj"]
Mar 08 21:34:01 crc kubenswrapper[4885]: I0308 21:34:01.913198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" event={"ID":"c2a860a0-84ad-49b7-8596-05521c33108a","Type":"ContainerStarted","Data":"5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a"}
Mar 08 21:34:02 crc kubenswrapper[4885]: I0308 21:34:02.929756 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" event={"ID":"c2a860a0-84ad-49b7-8596-05521c33108a","Type":"ContainerStarted","Data":"6d31a6020ea44ed51ad167034dfe4175ea1c3055421ddefd4060ab7f5195dfd9"}
Mar 08 21:34:02 crc kubenswrapper[4885]: I0308 21:34:02.965645 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" podStartSLOduration=1.6333635480000002 podStartE2EDuration="2.965615244s" podCreationTimestamp="2026-03-08 21:34:00 +0000 UTC" firstStartedPulling="2026-03-08 21:34:00.972393367 +0000 UTC m=+7342.368447390" lastFinishedPulling="2026-03-08 21:34:02.304645033 +0000 UTC m=+7343.700699086" observedRunningTime="2026-03-08 21:34:02.946683569 +0000 UTC m=+7344.342737622" watchObservedRunningTime="2026-03-08 21:34:02.965615244 +0000 UTC m=+7344.361669307"
Mar 08 21:34:03 crc kubenswrapper[4885]: I0308 21:34:03.949001 4885 generic.go:334] "Generic (PLEG): container finished" podID="c2a860a0-84ad-49b7-8596-05521c33108a" containerID="6d31a6020ea44ed51ad167034dfe4175ea1c3055421ddefd4060ab7f5195dfd9" exitCode=0
Mar 08 21:34:03 crc kubenswrapper[4885]: I0308 21:34:03.949310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" event={"ID":"c2a860a0-84ad-49b7-8596-05521c33108a","Type":"ContainerDied","Data":"6d31a6020ea44ed51ad167034dfe4175ea1c3055421ddefd4060ab7f5195dfd9"}
Mar 08 21:34:05 crc kubenswrapper[4885]: I0308 21:34:05.382068 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550094-mjnvj"
Mar 08 21:34:05 crc kubenswrapper[4885]: I0308 21:34:05.499315 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzp7n\" (UniqueName: \"kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n\") pod \"c2a860a0-84ad-49b7-8596-05521c33108a\" (UID: \"c2a860a0-84ad-49b7-8596-05521c33108a\") "
Mar 08 21:34:05 crc kubenswrapper[4885]: I0308 21:34:05.504830 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n" (OuterVolumeSpecName: "kube-api-access-xzp7n") pod "c2a860a0-84ad-49b7-8596-05521c33108a" (UID: "c2a860a0-84ad-49b7-8596-05521c33108a"). InnerVolumeSpecName "kube-api-access-xzp7n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:34:05 crc kubenswrapper[4885]: I0308 21:34:05.602982 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzp7n\" (UniqueName: \"kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n\") on node \"crc\" DevicePath \"\""
Mar 08 21:34:06 crc kubenswrapper[4885]: I0308 21:34:06.012444 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" event={"ID":"c2a860a0-84ad-49b7-8596-05521c33108a","Type":"ContainerDied","Data":"5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a"}
Mar 08 21:34:06 crc kubenswrapper[4885]: I0308 21:34:06.012495 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a"
Mar 08 21:34:06 crc kubenswrapper[4885]: I0308 21:34:06.012524 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550094-mjnvj"
Mar 08 21:34:06 crc kubenswrapper[4885]: I0308 21:34:06.048516 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550088-xzfvm"]
Mar 08 21:34:06 crc kubenswrapper[4885]: I0308 21:34:06.054844 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550088-xzfvm"]
Mar 08 21:34:07 crc kubenswrapper[4885]: I0308 21:34:07.381174 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec092c86-e0c8-415e-bfe7-80914fe8ce5b" path="/var/lib/kubelet/pods/ec092c86-e0c8-415e-bfe7-80914fe8ce5b/volumes"
Mar 08 21:34:12 crc kubenswrapper[4885]: I0308 21:34:12.187982 4885 scope.go:117] "RemoveContainer" containerID="38151e84b4b7428445936ba0b4e7f51f9bdc2be5ab2ec1353c272510a10895bd"
Mar 08 21:34:42 crc kubenswrapper[4885]: I0308 21:34:42.438301 4885 generic.go:334] "Generic (PLEG): container finished" podID="d2786842-7b37-4e0c-843e-9dc4467df6ad" containerID="f102d4cb9581980afc0e918a1eb63d77a02bb60102eaf5747b80166ea030a1cd" exitCode=0
Mar 08 21:34:42 crc kubenswrapper[4885]: I0308 21:34:42.438589 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" event={"ID":"d2786842-7b37-4e0c-843e-9dc4467df6ad","Type":"ContainerDied","Data":"f102d4cb9581980afc0e918a1eb63d77a02bb60102eaf5747b80166ea030a1cd"}
Mar 08 21:34:43 crc kubenswrapper[4885]: I0308 21:34:43.921470 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.023521 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmwsg\" (UniqueName: \"kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg\") pod \"d2786842-7b37-4e0c-843e-9dc4467df6ad\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") "
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.023591 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph\") pod \"d2786842-7b37-4e0c-843e-9dc4467df6ad\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") "
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.023635 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1\") pod \"d2786842-7b37-4e0c-843e-9dc4467df6ad\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") "
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.023672 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory\") pod \"d2786842-7b37-4e0c-843e-9dc4467df6ad\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") "
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.034209 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph" (OuterVolumeSpecName: "ceph") pod "d2786842-7b37-4e0c-843e-9dc4467df6ad" (UID: "d2786842-7b37-4e0c-843e-9dc4467df6ad"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.034354 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg" (OuterVolumeSpecName: "kube-api-access-vmwsg") pod "d2786842-7b37-4e0c-843e-9dc4467df6ad" (UID: "d2786842-7b37-4e0c-843e-9dc4467df6ad"). InnerVolumeSpecName "kube-api-access-vmwsg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.069527 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d2786842-7b37-4e0c-843e-9dc4467df6ad" (UID: "d2786842-7b37-4e0c-843e-9dc4467df6ad"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.074136 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory" (OuterVolumeSpecName: "inventory") pod "d2786842-7b37-4e0c-843e-9dc4467df6ad" (UID: "d2786842-7b37-4e0c-843e-9dc4467df6ad"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.125961 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph\") on node \"crc\" DevicePath \"\""
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.125998 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.126013 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory\") on node \"crc\" DevicePath \"\""
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.126025 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmwsg\" (UniqueName: \"kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg\") on node \"crc\" DevicePath \"\""
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.465246 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" event={"ID":"d2786842-7b37-4e0c-843e-9dc4467df6ad","Type":"ContainerDied","Data":"8cd15d3c2a11100671ae9112dd774068c60e76586455679373e108e318186d7e"}
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.465295 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.465310 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cd15d3c2a11100671ae9112dd774068c60e76586455679373e108e318186d7e"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.577069 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-s5bq4"]
Mar 08 21:34:44 crc kubenswrapper[4885]: E0308 21:34:44.577789 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a860a0-84ad-49b7-8596-05521c33108a" containerName="oc"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.577810 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a860a0-84ad-49b7-8596-05521c33108a" containerName="oc"
Mar 08 21:34:44 crc kubenswrapper[4885]: E0308 21:34:44.577836 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2786842-7b37-4e0c-843e-9dc4467df6ad" containerName="download-cache-openstack-openstack-cell1"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.577845 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2786842-7b37-4e0c-843e-9dc4467df6ad" containerName="download-cache-openstack-openstack-cell1"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.578247 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a860a0-84ad-49b7-8596-05521c33108a" containerName="oc"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.578267 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2786842-7b37-4e0c-843e-9dc4467df6ad" containerName="download-cache-openstack-openstack-cell1"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.579134 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.582937 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.583129 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.583298 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.583440 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.603531 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-s5bq4"]
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.638616 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.638706 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.638864 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2jb\" (UniqueName: \"kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.638962 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.740528 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2jb\" (UniqueName: \"kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.740658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.740729 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.740823 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.745892 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.746278 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.746633 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.760749 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh2jb\" (UniqueName: \"kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.906514 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:34:45 crc kubenswrapper[4885]: I0308 21:34:45.554859 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-s5bq4"]
Mar 08 21:34:46 crc kubenswrapper[4885]: I0308 21:34:46.109085 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 08 21:34:46 crc kubenswrapper[4885]: I0308 21:34:46.524889 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" event={"ID":"cd7ac915-62c8-4d95-96a3-899c245e685c","Type":"ContainerStarted","Data":"f1906982c8fca289eb830ed971294d6cad07ce24b9b4743a626905c8cd333806"}
Mar 08 21:34:46 crc kubenswrapper[4885]: I0308 21:34:46.525208 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" event={"ID":"cd7ac915-62c8-4d95-96a3-899c245e685c","Type":"ContainerStarted","Data":"e287346a42ab649f4a5a092a71c3a58750abbf25d49cc2934c8a59d2497b54ac"}
Mar 08 21:34:46 crc kubenswrapper[4885]: I0308 21:34:46.557600 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" podStartSLOduration=2.017231939 podStartE2EDuration="2.557582003s" podCreationTimestamp="2026-03-08 21:34:44 +0000 UTC" firstStartedPulling="2026-03-08 21:34:45.564507174 +0000 UTC m=+7386.960561197" lastFinishedPulling="2026-03-08 21:34:46.104857198 +0000 UTC m=+7387.500911261" observedRunningTime="2026-03-08 21:34:46.556505885 +0000 UTC m=+7387.952559918" watchObservedRunningTime="2026-03-08 21:34:46.557582003 +0000 UTC m=+7387.953636026"
Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.146480 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550096-4s5wc"]
Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.148362 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550096-4s5wc"
Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.150261 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.152453 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.152465 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.159973 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550096-4s5wc"]
Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.241693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tswn9\" (UniqueName: \"kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9\") pod \"auto-csr-approver-29550096-4s5wc\" (UID: \"44e60165-e38f-4fbe-87a1-5908598e0e38\") " pod="openshift-infra/auto-csr-approver-29550096-4s5wc"
Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.343354 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tswn9\" (UniqueName: \"kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9\") pod \"auto-csr-approver-29550096-4s5wc\" (UID: \"44e60165-e38f-4fbe-87a1-5908598e0e38\") " pod="openshift-infra/auto-csr-approver-29550096-4s5wc"
Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.364598 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tswn9\" (UniqueName: \"kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9\") pod \"auto-csr-approver-29550096-4s5wc\" (UID: \"44e60165-e38f-4fbe-87a1-5908598e0e38\") " pod="openshift-infra/auto-csr-approver-29550096-4s5wc"
Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.474095 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550096-4s5wc"
Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.992521 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550096-4s5wc"]
Mar 08 21:36:01 crc kubenswrapper[4885]: I0308 21:36:01.408076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" event={"ID":"44e60165-e38f-4fbe-87a1-5908598e0e38","Type":"ContainerStarted","Data":"3a98b6be408a0fe9e49da3722d528e5b8bf6f4d3016abf8e17fdd885370ac5e8"}
Mar 08 21:36:02 crc kubenswrapper[4885]: I0308 21:36:02.421667 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" event={"ID":"44e60165-e38f-4fbe-87a1-5908598e0e38","Type":"ContainerStarted","Data":"59a022bee7d69812163e1296ed21c2217e23eb0a10b094d9fb3faabfbcba446f"}
Mar 08 21:36:02 crc kubenswrapper[4885]: I0308 21:36:02.439968 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" podStartSLOduration=1.5337660130000002 podStartE2EDuration="2.439948205s" podCreationTimestamp="2026-03-08 21:36:00 +0000 UTC" firstStartedPulling="2026-03-08 21:36:00.993492682 +0000 UTC m=+7462.389546745" lastFinishedPulling="2026-03-08 21:36:01.899674904 +0000 UTC m=+7463.295728937" observedRunningTime="2026-03-08 21:36:02.43525746 +0000 UTC m=+7463.831311503" watchObservedRunningTime="2026-03-08 21:36:02.439948205 +0000 UTC m=+7463.836002238"
Mar 08 21:36:02 crc kubenswrapper[4885]: I0308 21:36:02.818065 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 21:36:02 crc kubenswrapper[4885]: I0308 21:36:02.818162 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 21:36:03 crc kubenswrapper[4885]: I0308 21:36:03.436967 4885 generic.go:334] "Generic (PLEG): container finished" podID="44e60165-e38f-4fbe-87a1-5908598e0e38" containerID="59a022bee7d69812163e1296ed21c2217e23eb0a10b094d9fb3faabfbcba446f" exitCode=0
Mar 08 21:36:03 crc kubenswrapper[4885]: I0308 21:36:03.437029 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" event={"ID":"44e60165-e38f-4fbe-87a1-5908598e0e38","Type":"ContainerDied","Data":"59a022bee7d69812163e1296ed21c2217e23eb0a10b094d9fb3faabfbcba446f"}
Mar 08 21:36:04 crc kubenswrapper[4885]: I0308 21:36:04.977275 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550096-4s5wc"
Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.069811 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tswn9\" (UniqueName: \"kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9\") pod \"44e60165-e38f-4fbe-87a1-5908598e0e38\" (UID: \"44e60165-e38f-4fbe-87a1-5908598e0e38\") "
Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.075762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9" (OuterVolumeSpecName: "kube-api-access-tswn9") pod "44e60165-e38f-4fbe-87a1-5908598e0e38" (UID: "44e60165-e38f-4fbe-87a1-5908598e0e38"). InnerVolumeSpecName "kube-api-access-tswn9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.172469 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tswn9\" (UniqueName: \"kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9\") on node \"crc\" DevicePath \"\""
Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.460414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" event={"ID":"44e60165-e38f-4fbe-87a1-5908598e0e38","Type":"ContainerDied","Data":"3a98b6be408a0fe9e49da3722d528e5b8bf6f4d3016abf8e17fdd885370ac5e8"}
Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.460464 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a98b6be408a0fe9e49da3722d528e5b8bf6f4d3016abf8e17fdd885370ac5e8"
Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.460515 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550096-4s5wc"
Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.535104 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550090-hww59"]
Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.548690 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550090-hww59"]
Mar 08 21:36:06 crc kubenswrapper[4885]: I0308 21:36:06.475225 4885 generic.go:334] "Generic (PLEG): container finished" podID="cd7ac915-62c8-4d95-96a3-899c245e685c" containerID="f1906982c8fca289eb830ed971294d6cad07ce24b9b4743a626905c8cd333806" exitCode=0
Mar 08 21:36:06 crc kubenswrapper[4885]: I0308 21:36:06.475283 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" event={"ID":"cd7ac915-62c8-4d95-96a3-899c245e685c","Type":"ContainerDied","Data":"f1906982c8fca289eb830ed971294d6cad07ce24b9b4743a626905c8cd333806"}
Mar 08 21:36:07 crc kubenswrapper[4885]: I0308 21:36:07.389090 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab193869-7f8c-4475-8be4-393848bd54e3" path="/var/lib/kubelet/pods/ab193869-7f8c-4475-8be4-393848bd54e3/volumes"
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.065849 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.246463 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1\") pod \"cd7ac915-62c8-4d95-96a3-899c245e685c\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") "
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.246575 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory\") pod \"cd7ac915-62c8-4d95-96a3-899c245e685c\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") "
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.246638 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph\") pod \"cd7ac915-62c8-4d95-96a3-899c245e685c\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") "
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.247390 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh2jb\" (UniqueName: \"kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb\") pod \"cd7ac915-62c8-4d95-96a3-899c245e685c\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") "
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.253061 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb" (OuterVolumeSpecName: "kube-api-access-dh2jb") pod "cd7ac915-62c8-4d95-96a3-899c245e685c" (UID: "cd7ac915-62c8-4d95-96a3-899c245e685c"). InnerVolumeSpecName "kube-api-access-dh2jb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.253295 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph" (OuterVolumeSpecName: "ceph") pod "cd7ac915-62c8-4d95-96a3-899c245e685c" (UID: "cd7ac915-62c8-4d95-96a3-899c245e685c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.290239 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cd7ac915-62c8-4d95-96a3-899c245e685c" (UID: "cd7ac915-62c8-4d95-96a3-899c245e685c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.301400 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory" (OuterVolumeSpecName: "inventory") pod "cd7ac915-62c8-4d95-96a3-899c245e685c" (UID: "cd7ac915-62c8-4d95-96a3-899c245e685c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.349611 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh2jb\" (UniqueName: \"kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb\") on node \"crc\" DevicePath \"\""
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.349641 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.349650 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory\") on node \"crc\" DevicePath \"\""
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.349659 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph\") on node \"crc\" DevicePath \"\""
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.497455 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" event={"ID":"cd7ac915-62c8-4d95-96a3-899c245e685c","Type":"ContainerDied","Data":"e287346a42ab649f4a5a092a71c3a58750abbf25d49cc2934c8a59d2497b54ac"}
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.497496 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e287346a42ab649f4a5a092a71c3a58750abbf25d49cc2934c8a59d2497b54ac"
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.497549 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4"
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.597680 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-lcz5l"]
Mar 08 21:36:08 crc kubenswrapper[4885]: E0308 21:36:08.609961 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e60165-e38f-4fbe-87a1-5908598e0e38" containerName="oc"
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.609991 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e60165-e38f-4fbe-87a1-5908598e0e38" containerName="oc"
Mar 08 21:36:08 crc kubenswrapper[4885]: E0308 21:36:08.610012 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7ac915-62c8-4d95-96a3-899c245e685c" containerName="configure-network-openstack-openstack-cell1"
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.610021 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7ac915-62c8-4d95-96a3-899c245e685c" containerName="configure-network-openstack-openstack-cell1"
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.610406 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd7ac915-62c8-4d95-96a3-899c245e685c" containerName="configure-network-openstack-openstack-cell1"
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.610428 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="44e60165-e38f-4fbe-87a1-5908598e0e38" containerName="oc"
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.611284 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-lcz5l"]
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.611377 4885 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.615056 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.615357 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.615490 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.615591 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.762506 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.762739 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.762774 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grmfz\" (UniqueName: \"kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz\") pod 
\"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.763014 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.865994 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.866050 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grmfz\" (UniqueName: \"kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.866110 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.866217 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.873258 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.875019 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.891368 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.896089 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grmfz\" (UniqueName: \"kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" 
Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.953548 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:09 crc kubenswrapper[4885]: I0308 21:36:09.553328 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-lcz5l"] Mar 08 21:36:10 crc kubenswrapper[4885]: I0308 21:36:10.521503 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" event={"ID":"df77d68a-3570-49fb-958b-c358543e661f","Type":"ContainerStarted","Data":"304da361b0a033def3dd5828f9ff9b635b383deac93423f4fbd44a861033ac2e"} Mar 08 21:36:10 crc kubenswrapper[4885]: I0308 21:36:10.521959 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" event={"ID":"df77d68a-3570-49fb-958b-c358543e661f","Type":"ContainerStarted","Data":"a529fc78229bb701bf7db7c1f844ed80a6801a69c68f6af881483d9a5e7c0552"} Mar 08 21:36:10 crc kubenswrapper[4885]: I0308 21:36:10.548041 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" podStartSLOduration=2.107426309 podStartE2EDuration="2.548021221s" podCreationTimestamp="2026-03-08 21:36:08 +0000 UTC" firstStartedPulling="2026-03-08 21:36:09.555084386 +0000 UTC m=+7470.951138419" lastFinishedPulling="2026-03-08 21:36:09.995679268 +0000 UTC m=+7471.391733331" observedRunningTime="2026-03-08 21:36:10.538304212 +0000 UTC m=+7471.934358255" watchObservedRunningTime="2026-03-08 21:36:10.548021221 +0000 UTC m=+7471.944075254" Mar 08 21:36:12 crc kubenswrapper[4885]: I0308 21:36:12.361580 4885 scope.go:117] "RemoveContainer" containerID="7eee2ffb4dea4a4b434fae8ad567627bf150b9abb9d76f55cab57ee721350700" Mar 08 21:36:15 crc kubenswrapper[4885]: I0308 21:36:15.583747 4885 generic.go:334] "Generic (PLEG): container 
finished" podID="df77d68a-3570-49fb-958b-c358543e661f" containerID="304da361b0a033def3dd5828f9ff9b635b383deac93423f4fbd44a861033ac2e" exitCode=0 Mar 08 21:36:15 crc kubenswrapper[4885]: I0308 21:36:15.583881 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" event={"ID":"df77d68a-3570-49fb-958b-c358543e661f","Type":"ContainerDied","Data":"304da361b0a033def3dd5828f9ff9b635b383deac93423f4fbd44a861033ac2e"} Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.110734 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.167907 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph\") pod \"df77d68a-3570-49fb-958b-c358543e661f\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.168016 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1\") pod \"df77d68a-3570-49fb-958b-c358543e661f\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.168039 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grmfz\" (UniqueName: \"kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz\") pod \"df77d68a-3570-49fb-958b-c358543e661f\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.168142 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory\") pod \"df77d68a-3570-49fb-958b-c358543e661f\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.174059 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz" (OuterVolumeSpecName: "kube-api-access-grmfz") pod "df77d68a-3570-49fb-958b-c358543e661f" (UID: "df77d68a-3570-49fb-958b-c358543e661f"). InnerVolumeSpecName "kube-api-access-grmfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.175155 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph" (OuterVolumeSpecName: "ceph") pod "df77d68a-3570-49fb-958b-c358543e661f" (UID: "df77d68a-3570-49fb-958b-c358543e661f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.203735 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "df77d68a-3570-49fb-958b-c358543e661f" (UID: "df77d68a-3570-49fb-958b-c358543e661f"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.206283 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory" (OuterVolumeSpecName: "inventory") pod "df77d68a-3570-49fb-958b-c358543e661f" (UID: "df77d68a-3570-49fb-958b-c358543e661f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.274081 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.274132 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.274173 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grmfz\" (UniqueName: \"kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.274185 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.601517 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" event={"ID":"df77d68a-3570-49fb-958b-c358543e661f","Type":"ContainerDied","Data":"a529fc78229bb701bf7db7c1f844ed80a6801a69c68f6af881483d9a5e7c0552"} Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.601899 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a529fc78229bb701bf7db7c1f844ed80a6801a69c68f6af881483d9a5e7c0552" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.601599 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.692165 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-mj45h"] Mar 08 21:36:17 crc kubenswrapper[4885]: E0308 21:36:17.692997 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df77d68a-3570-49fb-958b-c358543e661f" containerName="validate-network-openstack-openstack-cell1" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.693030 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="df77d68a-3570-49fb-958b-c358543e661f" containerName="validate-network-openstack-openstack-cell1" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.693393 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="df77d68a-3570-49fb-958b-c358543e661f" containerName="validate-network-openstack-openstack-cell1" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.694660 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.696827 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.697507 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.703071 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-mj45h"] Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.704719 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.704758 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.783943 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.784133 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jlv6\" (UniqueName: \"kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.784255 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.784364 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.886070 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jlv6\" (UniqueName: \"kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.886253 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.886375 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: 
I0308 21:36:17.886506 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.896039 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.896283 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.896457 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.907098 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jlv6\" (UniqueName: \"kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 
21:36:18 crc kubenswrapper[4885]: I0308 21:36:18.018600 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:18 crc kubenswrapper[4885]: I0308 21:36:18.626589 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-mj45h"] Mar 08 21:36:18 crc kubenswrapper[4885]: I0308 21:36:18.639062 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:36:19 crc kubenswrapper[4885]: I0308 21:36:19.631793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mj45h" event={"ID":"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4","Type":"ContainerStarted","Data":"240748009c7ed362d0f083a1ca76726a574cb7394b24e3c3af6ec8092554a1a3"} Mar 08 21:36:19 crc kubenswrapper[4885]: I0308 21:36:19.632523 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mj45h" event={"ID":"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4","Type":"ContainerStarted","Data":"1a4b98e79973daa361c60c2a7c32c23c681788d67729efc27ee5b7535067e98d"} Mar 08 21:36:19 crc kubenswrapper[4885]: I0308 21:36:19.659079 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-mj45h" podStartSLOduration=2.239729067 podStartE2EDuration="2.659064112s" podCreationTimestamp="2026-03-08 21:36:17 +0000 UTC" firstStartedPulling="2026-03-08 21:36:18.638501179 +0000 UTC m=+7480.034555242" lastFinishedPulling="2026-03-08 21:36:19.057836264 +0000 UTC m=+7480.453890287" observedRunningTime="2026-03-08 21:36:19.658063655 +0000 UTC m=+7481.054117708" watchObservedRunningTime="2026-03-08 21:36:19.659064112 +0000 UTC m=+7481.055118125" Mar 08 21:36:32 crc kubenswrapper[4885]: I0308 21:36:32.819024 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:36:32 crc kubenswrapper[4885]: I0308 21:36:32.819689 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:37:02 crc kubenswrapper[4885]: I0308 21:37:02.818337 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:37:02 crc kubenswrapper[4885]: I0308 21:37:02.820608 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:37:02 crc kubenswrapper[4885]: I0308 21:37:02.820756 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:37:02 crc kubenswrapper[4885]: I0308 21:37:02.821901 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:37:02 crc 
kubenswrapper[4885]: I0308 21:37:02.822145 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" gracePeriod=600 Mar 08 21:37:02 crc kubenswrapper[4885]: E0308 21:37:02.953513 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:37:03 crc kubenswrapper[4885]: I0308 21:37:03.216478 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" exitCode=0 Mar 08 21:37:03 crc kubenswrapper[4885]: I0308 21:37:03.216528 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9"} Mar 08 21:37:03 crc kubenswrapper[4885]: I0308 21:37:03.216566 4885 scope.go:117] "RemoveContainer" containerID="eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672" Mar 08 21:37:03 crc kubenswrapper[4885]: I0308 21:37:03.217288 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:37:03 crc kubenswrapper[4885]: E0308 21:37:03.217616 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:37:05 crc kubenswrapper[4885]: I0308 21:37:05.254844 4885 generic.go:334] "Generic (PLEG): container finished" podID="dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" containerID="240748009c7ed362d0f083a1ca76726a574cb7394b24e3c3af6ec8092554a1a3" exitCode=0 Mar 08 21:37:05 crc kubenswrapper[4885]: I0308 21:37:05.255159 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mj45h" event={"ID":"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4","Type":"ContainerDied","Data":"240748009c7ed362d0f083a1ca76726a574cb7394b24e3c3af6ec8092554a1a3"} Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.766653 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.865738 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jlv6\" (UniqueName: \"kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6\") pod \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.865873 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph\") pod \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.865956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1\") pod \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.866065 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory\") pod \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.872008 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph" (OuterVolumeSpecName: "ceph") pod "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" (UID: "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.876193 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6" (OuterVolumeSpecName: "kube-api-access-6jlv6") pod "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" (UID: "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4"). InnerVolumeSpecName "kube-api-access-6jlv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.908614 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" (UID: "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.914660 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory" (OuterVolumeSpecName: "inventory") pod "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" (UID: "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.968195 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.968240 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jlv6\" (UniqueName: \"kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.968262 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.968280 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.279286 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mj45h" event={"ID":"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4","Type":"ContainerDied","Data":"1a4b98e79973daa361c60c2a7c32c23c681788d67729efc27ee5b7535067e98d"} Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.279610 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a4b98e79973daa361c60c2a7c32c23c681788d67729efc27ee5b7535067e98d" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.279405 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.382840 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-9wz6s"] Mar 08 21:37:07 crc kubenswrapper[4885]: E0308 21:37:07.383276 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" containerName="install-os-openstack-openstack-cell1" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.383298 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" containerName="install-os-openstack-openstack-cell1" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.383656 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" containerName="install-os-openstack-openstack-cell1" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.384768 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.387671 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.387857 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.388122 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.388299 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.390166 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-9wz6s"] Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.477571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.477668 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xt2l\" (UniqueName: \"kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.477698 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.477878 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.581563 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.581839 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.582002 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xt2l\" (UniqueName: \"kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc 
kubenswrapper[4885]: I0308 21:37:07.582064 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.588563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.588822 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.590152 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.608964 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xt2l\" (UniqueName: \"kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " 
pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.709814 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:08 crc kubenswrapper[4885]: I0308 21:37:08.340264 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-9wz6s"] Mar 08 21:37:08 crc kubenswrapper[4885]: W0308 21:37:08.350048 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaa3ffc5_e09f_48b4_96b2_e2454bfe6251.slice/crio-38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16 WatchSource:0}: Error finding container 38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16: Status 404 returned error can't find the container with id 38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16 Mar 08 21:37:09 crc kubenswrapper[4885]: I0308 21:37:09.304544 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" event={"ID":"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251","Type":"ContainerStarted","Data":"2fc93de39b9fb91c25b50ddd230fffb16d182eadf38c5d9957013bd4c9846bf8"} Mar 08 21:37:09 crc kubenswrapper[4885]: I0308 21:37:09.305198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" event={"ID":"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251","Type":"ContainerStarted","Data":"38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16"} Mar 08 21:37:09 crc kubenswrapper[4885]: I0308 21:37:09.340100 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" podStartSLOduration=1.8077163619999999 podStartE2EDuration="2.340081852s" podCreationTimestamp="2026-03-08 21:37:07 +0000 UTC" firstStartedPulling="2026-03-08 
21:37:08.356348012 +0000 UTC m=+7529.752402035" lastFinishedPulling="2026-03-08 21:37:08.888713482 +0000 UTC m=+7530.284767525" observedRunningTime="2026-03-08 21:37:09.334578065 +0000 UTC m=+7530.730632108" watchObservedRunningTime="2026-03-08 21:37:09.340081852 +0000 UTC m=+7530.736135875" Mar 08 21:37:18 crc kubenswrapper[4885]: I0308 21:37:18.369026 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:37:18 crc kubenswrapper[4885]: E0308 21:37:18.370231 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:37:33 crc kubenswrapper[4885]: I0308 21:37:33.369527 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:37:33 crc kubenswrapper[4885]: E0308 21:37:33.370814 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:37:48 crc kubenswrapper[4885]: I0308 21:37:48.371153 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:37:48 crc kubenswrapper[4885]: E0308 21:37:48.373634 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:37:56 crc kubenswrapper[4885]: I0308 21:37:56.908309 4885 generic.go:334] "Generic (PLEG): container finished" podID="eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" containerID="2fc93de39b9fb91c25b50ddd230fffb16d182eadf38c5d9957013bd4c9846bf8" exitCode=0 Mar 08 21:37:56 crc kubenswrapper[4885]: I0308 21:37:56.908423 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" event={"ID":"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251","Type":"ContainerDied","Data":"2fc93de39b9fb91c25b50ddd230fffb16d182eadf38c5d9957013bd4c9846bf8"} Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.516728 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.555611 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1\") pod \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.555668 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph\") pod \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.555937 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory\") pod \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.555972 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xt2l\" (UniqueName: \"kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l\") pod \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.561987 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph" (OuterVolumeSpecName: "ceph") pod "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" (UID: "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.566162 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l" (OuterVolumeSpecName: "kube-api-access-2xt2l") pod "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" (UID: "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251"). InnerVolumeSpecName "kube-api-access-2xt2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.593387 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory" (OuterVolumeSpecName: "inventory") pod "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" (UID: "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.610127 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" (UID: "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.659288 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.659324 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xt2l\" (UniqueName: \"kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.659339 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.659355 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.952145 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" event={"ID":"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251","Type":"ContainerDied","Data":"38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16"} Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.952408 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.952538 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.063708 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-khtfv"] Mar 08 21:37:59 crc kubenswrapper[4885]: E0308 21:37:59.064241 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" containerName="configure-os-openstack-openstack-cell1" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.064259 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" containerName="configure-os-openstack-openstack-cell1" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.064457 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" containerName="configure-os-openstack-openstack-cell1" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.066496 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.070407 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.070656 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.075301 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.075608 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.079724 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-khtfv"] Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.167521 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.167570 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj6h7\" (UniqueName: \"kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.167599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.167856 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.269481 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.269650 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.269678 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj6h7\" (UniqueName: \"kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.269716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.275533 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.279143 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.279312 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.285792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj6h7\" (UniqueName: \"kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.383868 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.998410 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-khtfv"] Mar 08 21:38:00 crc kubenswrapper[4885]: W0308 21:38:00.010045 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbef3d518_c413_4129_b022_dffb097239b2.slice/crio-4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea WatchSource:0}: Error finding container 4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea: Status 404 returned error can't find the container with id 4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.132890 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550098-47g92"] Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.134768 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.137105 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.137417 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.137503 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.148528 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550098-47g92"] Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.195838 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j9qs\" (UniqueName: \"kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs\") pod \"auto-csr-approver-29550098-47g92\" (UID: \"67c51bcd-c065-4fa7-8318-0d0704836166\") " pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.299262 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j9qs\" (UniqueName: \"kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs\") pod \"auto-csr-approver-29550098-47g92\" (UID: \"67c51bcd-c065-4fa7-8318-0d0704836166\") " pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.320145 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j9qs\" (UniqueName: \"kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs\") pod \"auto-csr-approver-29550098-47g92\" (UID: \"67c51bcd-c065-4fa7-8318-0d0704836166\") " 
pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.368398 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:38:00 crc kubenswrapper[4885]: E0308 21:38:00.368667 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.457480 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.997252 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-khtfv" event={"ID":"bef3d518-c413-4129-b022-dffb097239b2","Type":"ContainerStarted","Data":"afa5cae2f58904151725f1848fae62debdefd513cc156b71cc3e5d0e9576488b"} Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.997916 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-khtfv" event={"ID":"bef3d518-c413-4129-b022-dffb097239b2","Type":"ContainerStarted","Data":"4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea"} Mar 08 21:38:01 crc kubenswrapper[4885]: I0308 21:38:01.022434 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-khtfv" podStartSLOduration=1.533455183 podStartE2EDuration="2.022414216s" podCreationTimestamp="2026-03-08 21:37:59 +0000 UTC" firstStartedPulling="2026-03-08 21:38:00.014163531 +0000 UTC m=+7581.410217564" lastFinishedPulling="2026-03-08 21:38:00.503122534 +0000 
UTC m=+7581.899176597" observedRunningTime="2026-03-08 21:38:01.018755068 +0000 UTC m=+7582.414809101" watchObservedRunningTime="2026-03-08 21:38:01.022414216 +0000 UTC m=+7582.418468249" Mar 08 21:38:01 crc kubenswrapper[4885]: W0308 21:38:01.046543 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67c51bcd_c065_4fa7_8318_0d0704836166.slice/crio-1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742 WatchSource:0}: Error finding container 1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742: Status 404 returned error can't find the container with id 1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742 Mar 08 21:38:01 crc kubenswrapper[4885]: I0308 21:38:01.049645 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550098-47g92"] Mar 08 21:38:02 crc kubenswrapper[4885]: I0308 21:38:02.008195 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550098-47g92" event={"ID":"67c51bcd-c065-4fa7-8318-0d0704836166","Type":"ContainerStarted","Data":"1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742"} Mar 08 21:38:03 crc kubenswrapper[4885]: I0308 21:38:03.020968 4885 generic.go:334] "Generic (PLEG): container finished" podID="67c51bcd-c065-4fa7-8318-0d0704836166" containerID="2fc2dee49966150d464450c5304d2011d968fb7949c03e2bf89d92f9c82630c7" exitCode=0 Mar 08 21:38:03 crc kubenswrapper[4885]: I0308 21:38:03.021112 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550098-47g92" event={"ID":"67c51bcd-c065-4fa7-8318-0d0704836166","Type":"ContainerDied","Data":"2fc2dee49966150d464450c5304d2011d968fb7949c03e2bf89d92f9c82630c7"} Mar 08 21:38:04 crc kubenswrapper[4885]: I0308 21:38:04.454133 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:04 crc kubenswrapper[4885]: I0308 21:38:04.510762 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j9qs\" (UniqueName: \"kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs\") pod \"67c51bcd-c065-4fa7-8318-0d0704836166\" (UID: \"67c51bcd-c065-4fa7-8318-0d0704836166\") " Mar 08 21:38:04 crc kubenswrapper[4885]: I0308 21:38:04.517203 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs" (OuterVolumeSpecName: "kube-api-access-4j9qs") pod "67c51bcd-c065-4fa7-8318-0d0704836166" (UID: "67c51bcd-c065-4fa7-8318-0d0704836166"). InnerVolumeSpecName "kube-api-access-4j9qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:38:04 crc kubenswrapper[4885]: I0308 21:38:04.613737 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j9qs\" (UniqueName: \"kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:05 crc kubenswrapper[4885]: I0308 21:38:05.044481 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550098-47g92" event={"ID":"67c51bcd-c065-4fa7-8318-0d0704836166","Type":"ContainerDied","Data":"1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742"} Mar 08 21:38:05 crc kubenswrapper[4885]: I0308 21:38:05.044563 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742" Mar 08 21:38:05 crc kubenswrapper[4885]: I0308 21:38:05.044570 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:05 crc kubenswrapper[4885]: I0308 21:38:05.552425 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550092-kjpvt"] Mar 08 21:38:05 crc kubenswrapper[4885]: I0308 21:38:05.562301 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550092-kjpvt"] Mar 08 21:38:07 crc kubenswrapper[4885]: I0308 21:38:07.397820 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244b80f2-9a2b-4db4-a451-086baed68f2a" path="/var/lib/kubelet/pods/244b80f2-9a2b-4db4-a451-086baed68f2a/volumes" Mar 08 21:38:10 crc kubenswrapper[4885]: I0308 21:38:10.098126 4885 generic.go:334] "Generic (PLEG): container finished" podID="bef3d518-c413-4129-b022-dffb097239b2" containerID="afa5cae2f58904151725f1848fae62debdefd513cc156b71cc3e5d0e9576488b" exitCode=0 Mar 08 21:38:10 crc kubenswrapper[4885]: I0308 21:38:10.098224 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-khtfv" event={"ID":"bef3d518-c413-4129-b022-dffb097239b2","Type":"ContainerDied","Data":"afa5cae2f58904151725f1848fae62debdefd513cc156b71cc3e5d0e9576488b"} Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.629139 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.686067 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0\") pod \"bef3d518-c413-4129-b022-dffb097239b2\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.686268 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1\") pod \"bef3d518-c413-4129-b022-dffb097239b2\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.686497 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj6h7\" (UniqueName: \"kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7\") pod \"bef3d518-c413-4129-b022-dffb097239b2\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.686558 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph\") pod \"bef3d518-c413-4129-b022-dffb097239b2\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.692364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph" (OuterVolumeSpecName: "ceph") pod "bef3d518-c413-4129-b022-dffb097239b2" (UID: "bef3d518-c413-4129-b022-dffb097239b2"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.706191 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7" (OuterVolumeSpecName: "kube-api-access-qj6h7") pod "bef3d518-c413-4129-b022-dffb097239b2" (UID: "bef3d518-c413-4129-b022-dffb097239b2"). InnerVolumeSpecName "kube-api-access-qj6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.721725 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "bef3d518-c413-4129-b022-dffb097239b2" (UID: "bef3d518-c413-4129-b022-dffb097239b2"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.733428 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "bef3d518-c413-4129-b022-dffb097239b2" (UID: "bef3d518-c413-4129-b022-dffb097239b2"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.792547 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj6h7\" (UniqueName: \"kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.792588 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.792602 4885 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.792615 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.127724 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-khtfv" event={"ID":"bef3d518-c413-4129-b022-dffb097239b2","Type":"ContainerDied","Data":"4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea"} Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.128065 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.127843 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.208582 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fg8lj"] Mar 08 21:38:12 crc kubenswrapper[4885]: E0308 21:38:12.209055 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c51bcd-c065-4fa7-8318-0d0704836166" containerName="oc" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.209073 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c51bcd-c065-4fa7-8318-0d0704836166" containerName="oc" Mar 08 21:38:12 crc kubenswrapper[4885]: E0308 21:38:12.209115 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef3d518-c413-4129-b022-dffb097239b2" containerName="ssh-known-hosts-openstack" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.209123 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef3d518-c413-4129-b022-dffb097239b2" containerName="ssh-known-hosts-openstack" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.209316 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef3d518-c413-4129-b022-dffb097239b2" containerName="ssh-known-hosts-openstack" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.209352 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c51bcd-c065-4fa7-8318-0d0704836166" containerName="oc" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.210096 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.213823 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.213828 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.214485 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.215307 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.234869 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fg8lj"] Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.368199 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:38:12 crc kubenswrapper[4885]: E0308 21:38:12.368485 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.406451 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " 
pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.406560 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.406761 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztjs9\" (UniqueName: \"kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.407079 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.509070 4885 scope.go:117] "RemoveContainer" containerID="3d584615f7c68fc963e45c7150d91d00d64b9ed657e10cd4826322e66a7ec964" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.510133 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.510247 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.512432 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztjs9\" (UniqueName: \"kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.513083 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.516613 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.517153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc 
kubenswrapper[4885]: I0308 21:38:12.523399 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.541323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztjs9\" (UniqueName: \"kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.835461 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:13 crc kubenswrapper[4885]: I0308 21:38:13.498980 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fg8lj"] Mar 08 21:38:14 crc kubenswrapper[4885]: I0308 21:38:14.163662 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" event={"ID":"53bb70ab-feea-49a2-9850-fc72a2e0f650","Type":"ContainerStarted","Data":"1d18d479d41ae3fffb9ebe77aea5f15426176538191699b2053934f75f6d3be6"} Mar 08 21:38:15 crc kubenswrapper[4885]: I0308 21:38:15.178179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" event={"ID":"53bb70ab-feea-49a2-9850-fc72a2e0f650","Type":"ContainerStarted","Data":"5cd8a7bb3282f36a39f24e29a6895e785eda2c152071f1591e21df14927a2b59"} Mar 08 21:38:15 crc kubenswrapper[4885]: I0308 21:38:15.204476 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" 
podStartSLOduration=2.674378032 podStartE2EDuration="3.2044426s" podCreationTimestamp="2026-03-08 21:38:12 +0000 UTC" firstStartedPulling="2026-03-08 21:38:13.506219642 +0000 UTC m=+7594.902273685" lastFinishedPulling="2026-03-08 21:38:14.03628424 +0000 UTC m=+7595.432338253" observedRunningTime="2026-03-08 21:38:15.199999091 +0000 UTC m=+7596.596053124" watchObservedRunningTime="2026-03-08 21:38:15.2044426 +0000 UTC m=+7596.600496673" Mar 08 21:38:22 crc kubenswrapper[4885]: I0308 21:38:22.306714 4885 generic.go:334] "Generic (PLEG): container finished" podID="53bb70ab-feea-49a2-9850-fc72a2e0f650" containerID="5cd8a7bb3282f36a39f24e29a6895e785eda2c152071f1591e21df14927a2b59" exitCode=0 Mar 08 21:38:22 crc kubenswrapper[4885]: I0308 21:38:22.307051 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" event={"ID":"53bb70ab-feea-49a2-9850-fc72a2e0f650","Type":"ContainerDied","Data":"5cd8a7bb3282f36a39f24e29a6895e785eda2c152071f1591e21df14927a2b59"} Mar 08 21:38:23 crc kubenswrapper[4885]: I0308 21:38:23.937224 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:23 crc kubenswrapper[4885]: I0308 21:38:23.995136 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory\") pod \"53bb70ab-feea-49a2-9850-fc72a2e0f650\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " Mar 08 21:38:23 crc kubenswrapper[4885]: I0308 21:38:23.995365 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztjs9\" (UniqueName: \"kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9\") pod \"53bb70ab-feea-49a2-9850-fc72a2e0f650\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " Mar 08 21:38:23 crc kubenswrapper[4885]: I0308 21:38:23.995681 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1\") pod \"53bb70ab-feea-49a2-9850-fc72a2e0f650\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " Mar 08 21:38:23 crc kubenswrapper[4885]: I0308 21:38:23.995723 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph\") pod \"53bb70ab-feea-49a2-9850-fc72a2e0f650\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.004435 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph" (OuterVolumeSpecName: "ceph") pod "53bb70ab-feea-49a2-9850-fc72a2e0f650" (UID: "53bb70ab-feea-49a2-9850-fc72a2e0f650"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.008702 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9" (OuterVolumeSpecName: "kube-api-access-ztjs9") pod "53bb70ab-feea-49a2-9850-fc72a2e0f650" (UID: "53bb70ab-feea-49a2-9850-fc72a2e0f650"). InnerVolumeSpecName "kube-api-access-ztjs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.034528 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "53bb70ab-feea-49a2-9850-fc72a2e0f650" (UID: "53bb70ab-feea-49a2-9850-fc72a2e0f650"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.034690 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory" (OuterVolumeSpecName: "inventory") pod "53bb70ab-feea-49a2-9850-fc72a2e0f650" (UID: "53bb70ab-feea-49a2-9850-fc72a2e0f650"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.098551 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.098587 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.098597 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.098608 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztjs9\" (UniqueName: \"kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.335699 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" event={"ID":"53bb70ab-feea-49a2-9850-fc72a2e0f650","Type":"ContainerDied","Data":"1d18d479d41ae3fffb9ebe77aea5f15426176538191699b2053934f75f6d3be6"} Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.336052 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d18d479d41ae3fffb9ebe77aea5f15426176538191699b2053934f75f6d3be6" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.335793 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.431854 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-k564w"] Mar 08 21:38:24 crc kubenswrapper[4885]: E0308 21:38:24.432887 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bb70ab-feea-49a2-9850-fc72a2e0f650" containerName="run-os-openstack-openstack-cell1" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.432965 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="53bb70ab-feea-49a2-9850-fc72a2e0f650" containerName="run-os-openstack-openstack-cell1" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.433502 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="53bb70ab-feea-49a2-9850-fc72a2e0f650" containerName="run-os-openstack-openstack-cell1" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.435277 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.440615 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.444663 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.444686 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.448446 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-k564w"] Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.449540 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.508382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.508459 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.508501 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2rl\" (UniqueName: 
\"kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.508529 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.610807 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2rl\" (UniqueName: \"kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.610863 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.611154 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 
21:38:24.611246 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.616499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.619888 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.625462 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.646546 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2rl\" (UniqueName: \"kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc 
kubenswrapper[4885]: I0308 21:38:24.764494 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:25 crc kubenswrapper[4885]: I0308 21:38:25.348792 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-k564w"] Mar 08 21:38:25 crc kubenswrapper[4885]: W0308 21:38:25.355253 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf062e16_c6d1_4d3c_b0aa_ca00e9740bcb.slice/crio-969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3 WatchSource:0}: Error finding container 969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3: Status 404 returned error can't find the container with id 969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3 Mar 08 21:38:25 crc kubenswrapper[4885]: I0308 21:38:25.369679 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:38:25 crc kubenswrapper[4885]: E0308 21:38:25.370098 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:38:26 crc kubenswrapper[4885]: I0308 21:38:26.364393 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" event={"ID":"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb","Type":"ContainerStarted","Data":"2867e8b1c59eb05f63d8e88e194178c1995075af4091bf7e0f00769e0787cb27"} Mar 08 21:38:26 crc kubenswrapper[4885]: I0308 21:38:26.365127 4885 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" event={"ID":"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb","Type":"ContainerStarted","Data":"969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3"} Mar 08 21:38:26 crc kubenswrapper[4885]: I0308 21:38:26.393770 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" podStartSLOduration=1.8649856919999999 podStartE2EDuration="2.393750676s" podCreationTimestamp="2026-03-08 21:38:24 +0000 UTC" firstStartedPulling="2026-03-08 21:38:25.360722021 +0000 UTC m=+7606.756776034" lastFinishedPulling="2026-03-08 21:38:25.889486995 +0000 UTC m=+7607.285541018" observedRunningTime="2026-03-08 21:38:26.386712138 +0000 UTC m=+7607.782766211" watchObservedRunningTime="2026-03-08 21:38:26.393750676 +0000 UTC m=+7607.789804699" Mar 08 21:38:37 crc kubenswrapper[4885]: I0308 21:38:37.369465 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:38:37 crc kubenswrapper[4885]: E0308 21:38:37.370353 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:38:42 crc kubenswrapper[4885]: I0308 21:38:42.562548 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" containerID="2867e8b1c59eb05f63d8e88e194178c1995075af4091bf7e0f00769e0787cb27" exitCode=0 Mar 08 21:38:42 crc kubenswrapper[4885]: I0308 21:38:42.562648 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" 
event={"ID":"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb","Type":"ContainerDied","Data":"2867e8b1c59eb05f63d8e88e194178c1995075af4091bf7e0f00769e0787cb27"} Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.262718 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.293742 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph\") pod \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.293851 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory\") pod \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.293904 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1\") pod \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.294182 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr2rl\" (UniqueName: \"kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl\") pod \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.300249 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph" (OuterVolumeSpecName: "ceph") pod 
"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" (UID: "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.300734 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl" (OuterVolumeSpecName: "kube-api-access-vr2rl") pod "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" (UID: "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb"). InnerVolumeSpecName "kube-api-access-vr2rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.325438 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" (UID: "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.338110 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory" (OuterVolumeSpecName: "inventory") pod "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" (UID: "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.397389 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.397429 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.397442 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr2rl\" (UniqueName: \"kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.397454 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.591136 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" event={"ID":"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb","Type":"ContainerDied","Data":"969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3"} Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.591179 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.591186 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.708300 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8t465"] Mar 08 21:38:44 crc kubenswrapper[4885]: E0308 21:38:44.709196 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" containerName="reboot-os-openstack-openstack-cell1" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.709218 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" containerName="reboot-os-openstack-openstack-cell1" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.709483 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" containerName="reboot-os-openstack-openstack-cell1" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.710536 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.712823 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.713153 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.713308 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.713466 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.722793 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8t465"] Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.811641 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bscmr\" (UniqueName: \"kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.811685 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.811711 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812392 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812517 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812552 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812586 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812640 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812748 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812964 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.813051 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: 
\"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.813104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914227 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914274 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914303 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914338 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914381 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914433 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914463 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914482 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " 
pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914535 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bscmr\" (UniqueName: \"kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914555 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914574 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.920441 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.921458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.921455 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.922282 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.922789 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " 
pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.922860 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.923240 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.924353 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.924614 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.925996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.930900 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.938977 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bscmr\" (UniqueName: \"kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:45 crc kubenswrapper[4885]: I0308 21:38:45.032535 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:38:45 crc kubenswrapper[4885]: I0308 21:38:45.041467 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:45 crc kubenswrapper[4885]: I0308 21:38:45.647539 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8t465"] Mar 08 21:38:46 crc kubenswrapper[4885]: I0308 21:38:46.075448 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:38:46 crc kubenswrapper[4885]: I0308 21:38:46.621034 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8t465" event={"ID":"125b54e2-cc1e-4a7f-83b6-1474e89bad11","Type":"ContainerStarted","Data":"b19a08d9ecc12b68e8b0b0593cef1a99e9fb0b0ad20005abcd3260a1bc1db5a8"} Mar 08 21:38:46 crc kubenswrapper[4885]: I0308 21:38:46.621466 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8t465" event={"ID":"125b54e2-cc1e-4a7f-83b6-1474e89bad11","Type":"ContainerStarted","Data":"79c3cc2d0faf9b3d13a3d7ce0804ff729b53636e7a0b97cc80b2e52d48a4b2a6"} Mar 08 21:38:46 crc kubenswrapper[4885]: I0308 21:38:46.665472 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-8t465" podStartSLOduration=2.2464177579999998 podStartE2EDuration="2.665443826s" podCreationTimestamp="2026-03-08 21:38:44 +0000 UTC" firstStartedPulling="2026-03-08 21:38:45.65297546 +0000 UTC m=+7627.049029483" lastFinishedPulling="2026-03-08 21:38:46.072001478 +0000 UTC m=+7627.468055551" observedRunningTime="2026-03-08 21:38:46.652273895 +0000 UTC m=+7628.048327938" watchObservedRunningTime="2026-03-08 21:38:46.665443826 +0000 UTC m=+7628.061497879" Mar 08 21:38:50 crc kubenswrapper[4885]: I0308 21:38:50.368180 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:38:50 crc kubenswrapper[4885]: E0308 21:38:50.369126 
4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:39:01 crc kubenswrapper[4885]: I0308 21:39:01.368872 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:39:01 crc kubenswrapper[4885]: E0308 21:39:01.369811 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.322994 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"] Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.326889 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.335693 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"] Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.463075 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.463515 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nwhf\" (UniqueName: \"kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.463684 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.566086 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.566174 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-8nwhf\" (UniqueName: \"kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.566237 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.566627 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.566680 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.586129 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nwhf\" (UniqueName: \"kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.689799 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.910475 4885 generic.go:334] "Generic (PLEG): container finished" podID="125b54e2-cc1e-4a7f-83b6-1474e89bad11" containerID="b19a08d9ecc12b68e8b0b0593cef1a99e9fb0b0ad20005abcd3260a1bc1db5a8" exitCode=0 Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.910855 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8t465" event={"ID":"125b54e2-cc1e-4a7f-83b6-1474e89bad11","Type":"ContainerDied","Data":"b19a08d9ecc12b68e8b0b0593cef1a99e9fb0b0ad20005abcd3260a1bc1db5a8"} Mar 08 21:39:07 crc kubenswrapper[4885]: I0308 21:39:07.205949 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"] Mar 08 21:39:07 crc kubenswrapper[4885]: I0308 21:39:07.921747 4885 generic.go:334] "Generic (PLEG): container finished" podID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerID="a423eca068764bdfe8800013c4a2fb15c30071d300eaa2bf147699572dd7d03e" exitCode=0 Mar 08 21:39:07 crc kubenswrapper[4885]: I0308 21:39:07.921820 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerDied","Data":"a423eca068764bdfe8800013c4a2fb15c30071d300eaa2bf147699572dd7d03e"} Mar 08 21:39:07 crc kubenswrapper[4885]: I0308 21:39:07.922450 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerStarted","Data":"00691bce0bb761767b57e77ad9015e8de45354bfaf62e2946b1fde8b97c48cf8"} Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.497603 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.538775 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.538822 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.538856 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.538889 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539060 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bscmr\" (UniqueName: \"kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc 
kubenswrapper[4885]: I0308 21:39:08.539129 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539183 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539203 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539233 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539253 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539284 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539309 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.544215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.545753 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.552344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.555604 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.556526 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph" (OuterVolumeSpecName: "ceph") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.556562 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr" (OuterVolumeSpecName: "kube-api-access-bscmr") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "kube-api-access-bscmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.559215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.559744 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.560236 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.563502 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.603467 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory" (OuterVolumeSpecName: "inventory") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643403 4885 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643442 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643451 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643463 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643475 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643485 4885 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643493 4885 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643503 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bscmr\" (UniqueName: \"kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643511 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643521 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643530 4885 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.659865 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.745486 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.938776 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerStarted","Data":"9e9db438534feaa447a73410606a84e2839386c0c4482c114e68bcc17de2fb97"} Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.941788 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8t465" event={"ID":"125b54e2-cc1e-4a7f-83b6-1474e89bad11","Type":"ContainerDied","Data":"79c3cc2d0faf9b3d13a3d7ce0804ff729b53636e7a0b97cc80b2e52d48a4b2a6"} Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.941814 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79c3cc2d0faf9b3d13a3d7ce0804ff729b53636e7a0b97cc80b2e52d48a4b2a6" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.941854 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.042039 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-jr59q"] Mar 08 21:39:09 crc kubenswrapper[4885]: E0308 21:39:09.042682 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125b54e2-cc1e-4a7f-83b6-1474e89bad11" containerName="install-certs-openstack-openstack-cell1" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.042783 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="125b54e2-cc1e-4a7f-83b6-1474e89bad11" containerName="install-certs-openstack-openstack-cell1" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.043081 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="125b54e2-cc1e-4a7f-83b6-1474e89bad11" containerName="install-certs-openstack-openstack-cell1" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.043863 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.053698 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.053937 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.054465 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.054583 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.062503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-jr59q"] Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.156890 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.157010 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.157071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.157109 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-796ks\" (UniqueName: \"kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.259705 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.259827 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.259961 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 
21:39:09.260027 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-796ks\" (UniqueName: \"kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.263751 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.264703 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.265395 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.275690 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-796ks\" (UniqueName: \"kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.366138 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:11 crc kubenswrapper[4885]: I0308 21:39:09.992664 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-jr59q"] Mar 08 21:39:11 crc kubenswrapper[4885]: I0308 21:39:10.966293 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" event={"ID":"9ef426ef-0010-4b6f-8b94-b45e726c2f02","Type":"ContainerStarted","Data":"549241e500a6e1c6bb8dd5a138d7b1eeefe0ecc38aa23303a9bfb803f38670ca"} Mar 08 21:39:11 crc kubenswrapper[4885]: I0308 21:39:11.997362 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" event={"ID":"9ef426ef-0010-4b6f-8b94-b45e726c2f02","Type":"ContainerStarted","Data":"81c147a1ae1e07d5d3e12c387e25ea5cc1f30efc8f47f3b7b07230af9ffa1c96"} Mar 08 21:39:12 crc kubenswrapper[4885]: I0308 21:39:12.026668 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" podStartSLOduration=1.7864100939999998 podStartE2EDuration="3.026645767s" podCreationTimestamp="2026-03-08 21:39:09 +0000 UTC" firstStartedPulling="2026-03-08 21:39:09.997262864 +0000 UTC m=+7651.393316917" lastFinishedPulling="2026-03-08 21:39:11.237498557 +0000 UTC m=+7652.633552590" observedRunningTime="2026-03-08 21:39:12.016784614 +0000 UTC m=+7653.412838627" watchObservedRunningTime="2026-03-08 21:39:12.026645767 +0000 UTC m=+7653.422699790" Mar 08 21:39:14 crc kubenswrapper[4885]: I0308 21:39:14.368773 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:39:14 crc kubenswrapper[4885]: E0308 21:39:14.369707 4885 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:39:17 crc kubenswrapper[4885]: I0308 21:39:17.103740 4885 generic.go:334] "Generic (PLEG): container finished" podID="9ef426ef-0010-4b6f-8b94-b45e726c2f02" containerID="81c147a1ae1e07d5d3e12c387e25ea5cc1f30efc8f47f3b7b07230af9ffa1c96" exitCode=0 Mar 08 21:39:17 crc kubenswrapper[4885]: I0308 21:39:17.103871 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" event={"ID":"9ef426ef-0010-4b6f-8b94-b45e726c2f02","Type":"ContainerDied","Data":"81c147a1ae1e07d5d3e12c387e25ea5cc1f30efc8f47f3b7b07230af9ffa1c96"} Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.601555 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.739948 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory\") pod \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.740171 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph\") pod \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.740303 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1\") pod \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.740345 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-796ks\" (UniqueName: \"kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks\") pod \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.746903 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks" (OuterVolumeSpecName: "kube-api-access-796ks") pod "9ef426ef-0010-4b6f-8b94-b45e726c2f02" (UID: "9ef426ef-0010-4b6f-8b94-b45e726c2f02"). InnerVolumeSpecName "kube-api-access-796ks". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.747995 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph" (OuterVolumeSpecName: "ceph") pod "9ef426ef-0010-4b6f-8b94-b45e726c2f02" (UID: "9ef426ef-0010-4b6f-8b94-b45e726c2f02"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.777344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory" (OuterVolumeSpecName: "inventory") pod "9ef426ef-0010-4b6f-8b94-b45e726c2f02" (UID: "9ef426ef-0010-4b6f-8b94-b45e726c2f02"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.778624 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "9ef426ef-0010-4b6f-8b94-b45e726c2f02" (UID: "9ef426ef-0010-4b6f-8b94-b45e726c2f02"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.842622 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory\") on node \"crc\" DevicePath \"\""
Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.842659 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph\") on node \"crc\" DevicePath \"\""
Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.842671 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.842687 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-796ks\" (UniqueName: \"kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks\") on node \"crc\" DevicePath \"\""
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.121942 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" event={"ID":"9ef426ef-0010-4b6f-8b94-b45e726c2f02","Type":"ContainerDied","Data":"549241e500a6e1c6bb8dd5a138d7b1eeefe0ecc38aa23303a9bfb803f38670ca"}
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.122276 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="549241e500a6e1c6bb8dd5a138d7b1eeefe0ecc38aa23303a9bfb803f38670ca"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.121983 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.204996 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-mz9gg"]
Mar 08 21:39:19 crc kubenswrapper[4885]: E0308 21:39:19.205433 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef426ef-0010-4b6f-8b94-b45e726c2f02" containerName="ceph-client-openstack-openstack-cell1"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.205451 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef426ef-0010-4b6f-8b94-b45e726c2f02" containerName="ceph-client-openstack-openstack-cell1"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.205658 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef426ef-0010-4b6f-8b94-b45e726c2f02" containerName="ceph-client-openstack-openstack-cell1"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.206366 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.208942 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.209009 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.209128 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.209761 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.215286 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.221197 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-mz9gg"]
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.350775 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4f5n\" (UniqueName: \"kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.350842 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.350891 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.350911 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.351004 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.351047 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.452903 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.453062 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4f5n\" (UniqueName: \"kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.453127 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.453192 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.453219 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.453680 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.454913 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.457600 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.460165 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.461145 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.461876 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.474573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4f5n\" (UniqueName: \"kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.529721 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-mz9gg"
Mar 08 21:39:20 crc kubenswrapper[4885]: I0308 21:39:20.149873 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-mz9gg"]
Mar 08 21:39:21 crc kubenswrapper[4885]: I0308 21:39:21.143202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" event={"ID":"ba3efa94-310a-4c53-ac95-2444759b8574","Type":"ContainerStarted","Data":"1a75e0057e0ad508c06200fa47793eaf63479725ebb9ae99a8b16d6f40fbf84d"}
Mar 08 21:39:21 crc kubenswrapper[4885]: I0308 21:39:21.143649 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" event={"ID":"ba3efa94-310a-4c53-ac95-2444759b8574","Type":"ContainerStarted","Data":"f4748d2e3bbfc369443a264393dc5de79561afca8381bfe0809066f999cc7b2e"}
Mar 08 21:39:21 crc kubenswrapper[4885]: I0308 21:39:21.147535 4885 generic.go:334] "Generic (PLEG): container finished" podID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerID="9e9db438534feaa447a73410606a84e2839386c0c4482c114e68bcc17de2fb97" exitCode=0
Mar 08 21:39:21 crc kubenswrapper[4885]: I0308 21:39:21.147574 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerDied","Data":"9e9db438534feaa447a73410606a84e2839386c0c4482c114e68bcc17de2fb97"}
Mar 08 21:39:21 crc kubenswrapper[4885]: I0308 21:39:21.204429 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" podStartSLOduration=1.7192958470000002 podStartE2EDuration="2.204407006s" podCreationTimestamp="2026-03-08 21:39:19 +0000 UTC" firstStartedPulling="2026-03-08 21:39:20.1646049 +0000 UTC m=+7661.560658933" lastFinishedPulling="2026-03-08 21:39:20.649716069 +0000 UTC m=+7662.045770092" observedRunningTime="2026-03-08 21:39:21.171531618 +0000 UTC m=+7662.567585641" watchObservedRunningTime="2026-03-08 21:39:21.204407006 +0000 UTC m=+7662.600461039"
Mar 08 21:39:22 crc kubenswrapper[4885]: I0308 21:39:22.163858 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerStarted","Data":"ec0aa2a4bbb0d043b313bc957b82a674549664f5372584f054d108aebd0c5d14"}
Mar 08 21:39:22 crc kubenswrapper[4885]: I0308 21:39:22.196783 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7qwl7" podStartSLOduration=2.46678899 podStartE2EDuration="16.196760755s" podCreationTimestamp="2026-03-08 21:39:06 +0000 UTC" firstStartedPulling="2026-03-08 21:39:07.923793626 +0000 UTC m=+7649.319847669" lastFinishedPulling="2026-03-08 21:39:21.653765371 +0000 UTC m=+7663.049819434" observedRunningTime="2026-03-08 21:39:22.184734135 +0000 UTC m=+7663.580788198" watchObservedRunningTime="2026-03-08 21:39:22.196760755 +0000 UTC m=+7663.592814788"
Mar 08 21:39:26 crc kubenswrapper[4885]: I0308 21:39:26.368588 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9"
Mar 08 21:39:26 crc kubenswrapper[4885]: E0308 21:39:26.369489 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:39:26 crc kubenswrapper[4885]: I0308 21:39:26.690169 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7qwl7"
Mar 08 21:39:26 crc kubenswrapper[4885]: I0308 21:39:26.690257 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7qwl7"
Mar 08 21:39:27 crc kubenswrapper[4885]: I0308 21:39:27.764605 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7qwl7" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" probeResult="failure" output=<
Mar 08 21:39:27 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s
Mar 08 21:39:27 crc kubenswrapper[4885]: >
Mar 08 21:39:37 crc kubenswrapper[4885]: I0308 21:39:37.743726 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7qwl7" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" probeResult="failure" output=<
Mar 08 21:39:37 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s
Mar 08 21:39:37 crc kubenswrapper[4885]: >
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.368363 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9"
Mar 08 21:39:41 crc kubenswrapper[4885]: E0308 21:39:41.369077 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.638426 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6bspm"]
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.642258 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.676971 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bspm"]
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.680204 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plh2g\" (UniqueName: \"kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.680340 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.680428 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.781520 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plh2g\" (UniqueName: \"kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.781825 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.781859 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.782463 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.782755 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.804142 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plh2g\" (UniqueName: \"kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.974115 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:42 crc kubenswrapper[4885]: I0308 21:39:42.567726 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bspm"]
Mar 08 21:39:43 crc kubenswrapper[4885]: I0308 21:39:43.410999 4885 generic.go:334] "Generic (PLEG): container finished" podID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerID="6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254" exitCode=0
Mar 08 21:39:43 crc kubenswrapper[4885]: I0308 21:39:43.411720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerDied","Data":"6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254"}
Mar 08 21:39:43 crc kubenswrapper[4885]: I0308 21:39:43.411805 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerStarted","Data":"ea86a8b7754eeae84af3d80c0bdfbba0a325e85525e457e4c2467add29827e9c"}
Mar 08 21:39:44 crc kubenswrapper[4885]: I0308 21:39:44.437966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerStarted","Data":"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77"}
Mar 08 21:39:46 crc kubenswrapper[4885]: I0308 21:39:46.465666 4885 generic.go:334] "Generic (PLEG): container finished" podID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerID="19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77" exitCode=0
Mar 08 21:39:46 crc kubenswrapper[4885]: I0308 21:39:46.465782 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerDied","Data":"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77"}
Mar 08 21:39:46 crc kubenswrapper[4885]: I0308 21:39:46.782352 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7qwl7"
Mar 08 21:39:46 crc kubenswrapper[4885]: I0308 21:39:46.874756 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7qwl7"
Mar 08 21:39:47 crc kubenswrapper[4885]: I0308 21:39:47.479450 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerStarted","Data":"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d"}
Mar 08 21:39:47 crc kubenswrapper[4885]: I0308 21:39:47.508803 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6bspm" podStartSLOduration=3.013178101 podStartE2EDuration="6.508784734s" podCreationTimestamp="2026-03-08 21:39:41 +0000 UTC" firstStartedPulling="2026-03-08 21:39:43.41927684 +0000 UTC m=+7684.815330893" lastFinishedPulling="2026-03-08 21:39:46.914883463 +0000 UTC m=+7688.310937526" observedRunningTime="2026-03-08 21:39:47.500234546 +0000 UTC m=+7688.896288569" watchObservedRunningTime="2026-03-08 21:39:47.508784734 +0000 UTC m=+7688.904838757"
Mar 08 21:39:49 crc kubenswrapper[4885]: I0308 21:39:49.215036 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"]
Mar 08 21:39:49 crc kubenswrapper[4885]: I0308 21:39:49.215510 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7qwl7" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" containerID="cri-o://ec0aa2a4bbb0d043b313bc957b82a674549664f5372584f054d108aebd0c5d14" gracePeriod=2
Mar 08 21:39:49 crc kubenswrapper[4885]: I0308 21:39:49.510035 4885 generic.go:334] "Generic (PLEG): container finished" podID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerID="ec0aa2a4bbb0d043b313bc957b82a674549664f5372584f054d108aebd0c5d14" exitCode=0
Mar 08 21:39:49 crc kubenswrapper[4885]: I0308 21:39:49.510200 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerDied","Data":"ec0aa2a4bbb0d043b313bc957b82a674549664f5372584f054d108aebd0c5d14"}
Mar 08 21:39:49 crc kubenswrapper[4885]: I0308 21:39:49.928220 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qwl7"
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.023354 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities\") pod \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") "
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.023531 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nwhf\" (UniqueName: \"kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf\") pod \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") "
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.023732 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content\") pod \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") "
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.024048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities" (OuterVolumeSpecName: "utilities") pod "ebd3b0b8-3fd7-4386-b558-e083d8665e9e" (UID: "ebd3b0b8-3fd7-4386-b558-e083d8665e9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.024325 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.029639 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf" (OuterVolumeSpecName: "kube-api-access-8nwhf") pod "ebd3b0b8-3fd7-4386-b558-e083d8665e9e" (UID: "ebd3b0b8-3fd7-4386-b558-e083d8665e9e"). InnerVolumeSpecName "kube-api-access-8nwhf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.126582 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nwhf\" (UniqueName: \"kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf\") on node \"crc\" DevicePath \"\""
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.139442 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebd3b0b8-3fd7-4386-b558-e083d8665e9e" (UID: "ebd3b0b8-3fd7-4386-b558-e083d8665e9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.228525 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.524045 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerDied","Data":"00691bce0bb761767b57e77ad9015e8de45354bfaf62e2946b1fde8b97c48cf8"}
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.524422 4885 scope.go:117] "RemoveContainer" containerID="ec0aa2a4bbb0d043b313bc957b82a674549664f5372584f054d108aebd0c5d14"
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.524718 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qwl7"
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.584370 4885 scope.go:117] "RemoveContainer" containerID="9e9db438534feaa447a73410606a84e2839386c0c4482c114e68bcc17de2fb97"
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.588787 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"]
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.604808 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"]
Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.621874 4885 scope.go:117] "RemoveContainer" containerID="a423eca068764bdfe8800013c4a2fb15c30071d300eaa2bf147699572dd7d03e"
Mar 08 21:39:51 crc kubenswrapper[4885]: I0308 21:39:51.382270 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" path="/var/lib/kubelet/pods/ebd3b0b8-3fd7-4386-b558-e083d8665e9e/volumes"
Mar 08 21:39:51 crc kubenswrapper[4885]: I0308 21:39:51.974480 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:51 crc kubenswrapper[4885]: I0308 21:39:51.974539 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:52 crc kubenswrapper[4885]: I0308 21:39:52.045687 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:52 crc kubenswrapper[4885]: I0308 21:39:52.648634 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:53 crc kubenswrapper[4885]: I0308 21:39:53.368385 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9"
Mar 08 21:39:53 crc kubenswrapper[4885]: E0308 21:39:53.368656 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 21:39:53 crc kubenswrapper[4885]: I0308 21:39:53.422847 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bspm"]
Mar 08 21:39:54 crc kubenswrapper[4885]: I0308 21:39:54.570065 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6bspm" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="registry-server" containerID="cri-o://d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d" gracePeriod=2
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.169659 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bspm"
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.264240 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plh2g\" (UniqueName: \"kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g\") pod \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") "
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.264304 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content\") pod \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") "
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.264532 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities\") pod \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") "
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.266149 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities" (OuterVolumeSpecName: "utilities") pod "76335e5d-7b7b-457d-a69c-90318e5cbbb4" (UID: "76335e5d-7b7b-457d-a69c-90318e5cbbb4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.276227 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g" (OuterVolumeSpecName: "kube-api-access-plh2g") pod "76335e5d-7b7b-457d-a69c-90318e5cbbb4" (UID: "76335e5d-7b7b-457d-a69c-90318e5cbbb4"). InnerVolumeSpecName "kube-api-access-plh2g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.347779 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76335e5d-7b7b-457d-a69c-90318e5cbbb4" (UID: "76335e5d-7b7b-457d-a69c-90318e5cbbb4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.366307 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.366362 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plh2g\" (UniqueName: \"kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g\") on node \"crc\" DevicePath \"\""
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.366378 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.582740 4885 generic.go:334] "Generic (PLEG): container finished" podID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerID="d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d" exitCode=0
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.583137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerDied","Data":"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d"}
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.583179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerDied","Data":"ea86a8b7754eeae84af3d80c0bdfbba0a325e85525e457e4c2467add29827e9c"}
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.583209 4885 scope.go:117] "RemoveContainer" containerID="d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d"
Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.583423 4885 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.608237 4885 scope.go:117] "RemoveContainer" containerID="19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.626449 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bspm"] Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.640910 4885 scope.go:117] "RemoveContainer" containerID="6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.644292 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6bspm"] Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.703319 4885 scope.go:117] "RemoveContainer" containerID="d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d" Mar 08 21:39:55 crc kubenswrapper[4885]: E0308 21:39:55.704423 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d\": container with ID starting with d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d not found: ID does not exist" containerID="d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.704489 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d"} err="failed to get container status \"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d\": rpc error: code = NotFound desc = could not find container \"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d\": container with ID starting with d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d not 
found: ID does not exist" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.704529 4885 scope.go:117] "RemoveContainer" containerID="19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77" Mar 08 21:39:55 crc kubenswrapper[4885]: E0308 21:39:55.704862 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77\": container with ID starting with 19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77 not found: ID does not exist" containerID="19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.704899 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77"} err="failed to get container status \"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77\": rpc error: code = NotFound desc = could not find container \"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77\": container with ID starting with 19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77 not found: ID does not exist" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.704944 4885 scope.go:117] "RemoveContainer" containerID="6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254" Mar 08 21:39:55 crc kubenswrapper[4885]: E0308 21:39:55.705241 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254\": container with ID starting with 6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254 not found: ID does not exist" containerID="6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.705266 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254"} err="failed to get container status \"6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254\": rpc error: code = NotFound desc = could not find container \"6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254\": container with ID starting with 6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254 not found: ID does not exist" Mar 08 21:39:57 crc kubenswrapper[4885]: I0308 21:39:57.386522 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" path="/var/lib/kubelet/pods/76335e5d-7b7b-457d-a69c-90318e5cbbb4/volumes" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.155071 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550100-8r628"] Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.155883 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.155899 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.155914 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="extract-content" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.155938 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="extract-content" Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.155959 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 
21:40:00.155968 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.155979 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="extract-utilities" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.155987 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="extract-utilities" Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.155996 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="extract-content" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.156003 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="extract-content" Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.156029 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="extract-utilities" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.156037 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="extract-utilities" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.156319 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.156357 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.157247 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.159853 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.160163 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.160364 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.168743 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550100-8r628"] Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.182103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlffd\" (UniqueName: \"kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd\") pod \"auto-csr-approver-29550100-8r628\" (UID: \"70f4584f-eeba-4b88-b31d-79a39f062bd3\") " pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.284629 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlffd\" (UniqueName: \"kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd\") pod \"auto-csr-approver-29550100-8r628\" (UID: \"70f4584f-eeba-4b88-b31d-79a39f062bd3\") " pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.304949 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlffd\" (UniqueName: \"kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd\") pod \"auto-csr-approver-29550100-8r628\" (UID: \"70f4584f-eeba-4b88-b31d-79a39f062bd3\") " 
pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.528403 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:01 crc kubenswrapper[4885]: I0308 21:40:01.019169 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550100-8r628"] Mar 08 21:40:01 crc kubenswrapper[4885]: I0308 21:40:01.669471 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550100-8r628" event={"ID":"70f4584f-eeba-4b88-b31d-79a39f062bd3","Type":"ContainerStarted","Data":"4a8818f60510fa842b65ccbd2ec3a3a10b4c232cb4ef875cfbab6d8dda0fafaf"} Mar 08 21:40:02 crc kubenswrapper[4885]: I0308 21:40:02.690999 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550100-8r628" event={"ID":"70f4584f-eeba-4b88-b31d-79a39f062bd3","Type":"ContainerStarted","Data":"51c3dd5d7c040092d94f90c0958c32d896513ce729f2d08313181e0c6cc74c41"} Mar 08 21:40:02 crc kubenswrapper[4885]: I0308 21:40:02.714506 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550100-8r628" podStartSLOduration=1.506346207 podStartE2EDuration="2.714485693s" podCreationTimestamp="2026-03-08 21:40:00 +0000 UTC" firstStartedPulling="2026-03-08 21:40:01.025012898 +0000 UTC m=+7702.421066921" lastFinishedPulling="2026-03-08 21:40:02.233152384 +0000 UTC m=+7703.629206407" observedRunningTime="2026-03-08 21:40:02.70238994 +0000 UTC m=+7704.098443993" watchObservedRunningTime="2026-03-08 21:40:02.714485693 +0000 UTC m=+7704.110539726" Mar 08 21:40:03 crc kubenswrapper[4885]: I0308 21:40:03.705154 4885 generic.go:334] "Generic (PLEG): container finished" podID="70f4584f-eeba-4b88-b31d-79a39f062bd3" containerID="51c3dd5d7c040092d94f90c0958c32d896513ce729f2d08313181e0c6cc74c41" exitCode=0 Mar 08 21:40:03 crc 
kubenswrapper[4885]: I0308 21:40:03.705228 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550100-8r628" event={"ID":"70f4584f-eeba-4b88-b31d-79a39f062bd3","Type":"ContainerDied","Data":"51c3dd5d7c040092d94f90c0958c32d896513ce729f2d08313181e0c6cc74c41"} Mar 08 21:40:04 crc kubenswrapper[4885]: I0308 21:40:04.369006 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:40:04 crc kubenswrapper[4885]: E0308 21:40:04.369735 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.133944 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.214249 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlffd\" (UniqueName: \"kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd\") pod \"70f4584f-eeba-4b88-b31d-79a39f062bd3\" (UID: \"70f4584f-eeba-4b88-b31d-79a39f062bd3\") " Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.219303 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd" (OuterVolumeSpecName: "kube-api-access-vlffd") pod "70f4584f-eeba-4b88-b31d-79a39f062bd3" (UID: "70f4584f-eeba-4b88-b31d-79a39f062bd3"). InnerVolumeSpecName "kube-api-access-vlffd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.317396 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlffd\" (UniqueName: \"kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.728771 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550100-8r628" event={"ID":"70f4584f-eeba-4b88-b31d-79a39f062bd3","Type":"ContainerDied","Data":"4a8818f60510fa842b65ccbd2ec3a3a10b4c232cb4ef875cfbab6d8dda0fafaf"} Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.729150 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a8818f60510fa842b65ccbd2ec3a3a10b4c232cb4ef875cfbab6d8dda0fafaf" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.728868 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.788079 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550094-mjnvj"] Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.798063 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550094-mjnvj"] Mar 08 21:40:07 crc kubenswrapper[4885]: I0308 21:40:07.385027 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a860a0-84ad-49b7-8596-05521c33108a" path="/var/lib/kubelet/pods/c2a860a0-84ad-49b7-8596-05521c33108a/volumes" Mar 08 21:40:12 crc kubenswrapper[4885]: I0308 21:40:12.782749 4885 scope.go:117] "RemoveContainer" containerID="6d31a6020ea44ed51ad167034dfe4175ea1c3055421ddefd4060ab7f5195dfd9" Mar 08 21:40:16 crc kubenswrapper[4885]: I0308 21:40:16.370579 4885 scope.go:117] "RemoveContainer" 
containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:40:16 crc kubenswrapper[4885]: E0308 21:40:16.371414 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:40:28 crc kubenswrapper[4885]: I0308 21:40:28.368863 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:40:28 crc kubenswrapper[4885]: E0308 21:40:28.369811 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:40:29 crc kubenswrapper[4885]: I0308 21:40:29.161692 4885 generic.go:334] "Generic (PLEG): container finished" podID="ba3efa94-310a-4c53-ac95-2444759b8574" containerID="1a75e0057e0ad508c06200fa47793eaf63479725ebb9ae99a8b16d6f40fbf84d" exitCode=0 Mar 08 21:40:29 crc kubenswrapper[4885]: I0308 21:40:29.161846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" event={"ID":"ba3efa94-310a-4c53-ac95-2444759b8574","Type":"ContainerDied","Data":"1a75e0057e0ad508c06200fa47793eaf63479725ebb9ae99a8b16d6f40fbf84d"} Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.616793 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711183 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711312 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711388 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711408 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711477 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711512 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-b4f5n\" (UniqueName: \"kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.716888 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n" (OuterVolumeSpecName: "kube-api-access-b4f5n") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "kube-api-access-b4f5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.717083 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph" (OuterVolumeSpecName: "ceph") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.721054 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.738186 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.742792 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory" (OuterVolumeSpecName: "inventory") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.746349 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813545 4885 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813586 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813600 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813615 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813626 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813637 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4f5n\" (UniqueName: \"kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.184101 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" event={"ID":"ba3efa94-310a-4c53-ac95-2444759b8574","Type":"ContainerDied","Data":"f4748d2e3bbfc369443a264393dc5de79561afca8381bfe0809066f999cc7b2e"} Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.184159 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4748d2e3bbfc369443a264393dc5de79561afca8381bfe0809066f999cc7b2e" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.184222 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.345452 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-hwhq9"] Mar 08 21:40:31 crc kubenswrapper[4885]: E0308 21:40:31.346171 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3efa94-310a-4c53-ac95-2444759b8574" containerName="ovn-openstack-openstack-cell1" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.346255 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3efa94-310a-4c53-ac95-2444759b8574" containerName="ovn-openstack-openstack-cell1" Mar 08 21:40:31 crc kubenswrapper[4885]: E0308 21:40:31.346367 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f4584f-eeba-4b88-b31d-79a39f062bd3" containerName="oc" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.346432 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f4584f-eeba-4b88-b31d-79a39f062bd3" containerName="oc" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.346737 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3efa94-310a-4c53-ac95-2444759b8574" containerName="ovn-openstack-openstack-cell1" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.346824 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f4584f-eeba-4b88-b31d-79a39f062bd3" containerName="oc" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.347742 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.350500 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.350789 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.351367 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.352866 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.353207 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.357062 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.360479 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-hwhq9"] Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429199 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429240 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429276 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429309 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9mhx\" (UniqueName: \"kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429451 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429639 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0\") pod 
\"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429989 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532537 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532605 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9mhx\" (UniqueName: \"kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" 
Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532713 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532806 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532895 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532916 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.537176 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.538883 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.539516 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.540885 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.541395 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.541576 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.555141 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9mhx\" (UniqueName: \"kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.677178 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:32 crc kubenswrapper[4885]: I0308 21:40:32.353274 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-hwhq9"] Mar 08 21:40:33 crc kubenswrapper[4885]: I0308 21:40:33.204032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" event={"ID":"12740b7f-a6a2-45e2-a288-fbb880a2c72b","Type":"ContainerStarted","Data":"fcaf2ec05dbabc2bcbe53d0ade85f799708a197bedd504536ecc8c4e202e5c29"} Mar 08 21:40:34 crc kubenswrapper[4885]: I0308 21:40:34.219675 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" event={"ID":"12740b7f-a6a2-45e2-a288-fbb880a2c72b","Type":"ContainerStarted","Data":"46937056048ab7bfd833e96e09493fd380296e1d284b835fc82f2a3fb07daf4f"} Mar 08 21:40:34 crc kubenswrapper[4885]: I0308 21:40:34.256055 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" podStartSLOduration=2.764488997 podStartE2EDuration="3.256031279s" podCreationTimestamp="2026-03-08 21:40:31 +0000 UTC" firstStartedPulling="2026-03-08 21:40:32.358092772 +0000 UTC m=+7733.754146795" lastFinishedPulling="2026-03-08 21:40:32.849635054 +0000 UTC m=+7734.245689077" observedRunningTime="2026-03-08 21:40:34.249669509 +0000 UTC m=+7735.645723562" watchObservedRunningTime="2026-03-08 21:40:34.256031279 +0000 UTC m=+7735.652085332" Mar 08 21:40:43 crc kubenswrapper[4885]: I0308 21:40:43.369353 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:40:43 crc kubenswrapper[4885]: E0308 21:40:43.370403 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:40:58 crc kubenswrapper[4885]: I0308 21:40:58.369015 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:40:58 crc kubenswrapper[4885]: E0308 21:40:58.370044 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:41:09 crc kubenswrapper[4885]: I0308 21:41:09.406103 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:41:09 crc kubenswrapper[4885]: E0308 21:41:09.410345 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:41:23 crc kubenswrapper[4885]: I0308 21:41:23.369775 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:41:23 crc kubenswrapper[4885]: E0308 21:41:23.371011 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:41:26 crc kubenswrapper[4885]: I0308 21:41:26.872985 4885 generic.go:334] "Generic (PLEG): container finished" podID="12740b7f-a6a2-45e2-a288-fbb880a2c72b" containerID="46937056048ab7bfd833e96e09493fd380296e1d284b835fc82f2a3fb07daf4f" exitCode=0 Mar 08 21:41:26 crc kubenswrapper[4885]: I0308 21:41:26.873022 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" event={"ID":"12740b7f-a6a2-45e2-a288-fbb880a2c72b","Type":"ContainerDied","Data":"46937056048ab7bfd833e96e09493fd380296e1d284b835fc82f2a3fb07daf4f"} Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.428497 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551171 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551328 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551465 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9mhx\" (UniqueName: 
\"kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551498 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551541 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551599 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551625 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.560645 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod 
"12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.560713 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph" (OuterVolumeSpecName: "ceph") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.563331 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx" (OuterVolumeSpecName: "kube-api-access-t9mhx") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "kube-api-access-t9mhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.582347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.603420 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory" (OuterVolumeSpecName: "inventory") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.612337 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.614441 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654179 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9mhx\" (UniqueName: \"kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654217 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654236 4885 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654250 4885 reconciler_common.go:293] 
"Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654263 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654274 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654285 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.902651 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" event={"ID":"12740b7f-a6a2-45e2-a288-fbb880a2c72b","Type":"ContainerDied","Data":"fcaf2ec05dbabc2bcbe53d0ade85f799708a197bedd504536ecc8c4e202e5c29"} Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.902701 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcaf2ec05dbabc2bcbe53d0ade85f799708a197bedd504536ecc8c4e202e5c29" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.903104 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.048214 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rd94l"] Mar 08 21:41:29 crc kubenswrapper[4885]: E0308 21:41:29.048812 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12740b7f-a6a2-45e2-a288-fbb880a2c72b" containerName="neutron-metadata-openstack-openstack-cell1" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.048845 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="12740b7f-a6a2-45e2-a288-fbb880a2c72b" containerName="neutron-metadata-openstack-openstack-cell1" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.049201 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="12740b7f-a6a2-45e2-a288-fbb880a2c72b" containerName="neutron-metadata-openstack-openstack-cell1" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.050215 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.053390 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.053622 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.053834 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.054658 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.055625 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.060650 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rd94l"] Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.183475 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.183536 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc 
kubenswrapper[4885]: I0308 21:41:29.183588 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.184153 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dz7c\" (UniqueName: \"kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.184247 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.184267 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.286748 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph\") pod 
\"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.286947 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.287166 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dz7c\" (UniqueName: \"kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.287224 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.287262 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.287410 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" 
(UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.292900 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.293344 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.293893 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.294823 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.294884 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.319208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dz7c\" (UniqueName: \"kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.373168 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:30 crc kubenswrapper[4885]: I0308 21:41:30.046523 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:41:30 crc kubenswrapper[4885]: I0308 21:41:30.050112 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rd94l"] Mar 08 21:41:30 crc kubenswrapper[4885]: I0308 21:41:30.944539 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" event={"ID":"992d3500-f892-42c6-805f-ae9c96793d0f","Type":"ContainerStarted","Data":"135b71ffe2c5954d98583a529a1ec42b83ce4fb4d31e613f230e8d7339fd376b"} Mar 08 21:41:30 crc kubenswrapper[4885]: I0308 21:41:30.945054 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" event={"ID":"992d3500-f892-42c6-805f-ae9c96793d0f","Type":"ContainerStarted","Data":"2ee7a4abb905c5d1b376a7b80993ad31e44b7c26d66ceb2dc9a19d787251b55b"} Mar 08 21:41:30 crc kubenswrapper[4885]: I0308 21:41:30.990798 4885 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" podStartSLOduration=1.54456602 podStartE2EDuration="1.990769752s" podCreationTimestamp="2026-03-08 21:41:29 +0000 UTC" firstStartedPulling="2026-03-08 21:41:30.046220797 +0000 UTC m=+7791.442274820" lastFinishedPulling="2026-03-08 21:41:30.492424489 +0000 UTC m=+7791.888478552" observedRunningTime="2026-03-08 21:41:30.971400415 +0000 UTC m=+7792.367454478" watchObservedRunningTime="2026-03-08 21:41:30.990769752 +0000 UTC m=+7792.386823805" Mar 08 21:41:35 crc kubenswrapper[4885]: I0308 21:41:35.368133 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:41:35 crc kubenswrapper[4885]: E0308 21:41:35.369207 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:41:46 crc kubenswrapper[4885]: I0308 21:41:46.369330 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:41:46 crc kubenswrapper[4885]: E0308 21:41:46.370515 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.155891 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29550102-v4g2f"] Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.158161 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.176355 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550102-v4g2f"] Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.187871 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.187974 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.188187 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.203708 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6mf\" (UniqueName: \"kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf\") pod \"auto-csr-approver-29550102-v4g2f\" (UID: \"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2\") " pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.305759 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6mf\" (UniqueName: \"kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf\") pod \"auto-csr-approver-29550102-v4g2f\" (UID: \"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2\") " pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.335158 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6mf\" (UniqueName: 
\"kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf\") pod \"auto-csr-approver-29550102-v4g2f\" (UID: \"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2\") " pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.368214 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:42:00 crc kubenswrapper[4885]: E0308 21:42:00.368591 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.510665 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:01 crc kubenswrapper[4885]: I0308 21:42:01.067393 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550102-v4g2f"] Mar 08 21:42:01 crc kubenswrapper[4885]: I0308 21:42:01.393511 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" event={"ID":"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2","Type":"ContainerStarted","Data":"bb21f52acf1e88f6ef71055d97c4e3a5f049f05fa8b3d22063698f1482b4cc18"} Mar 08 21:42:03 crc kubenswrapper[4885]: I0308 21:42:03.422130 4885 generic.go:334] "Generic (PLEG): container finished" podID="ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" containerID="353dc00bbe28654011d7f403f11d20bd383c705d36f6523ce418dd57c34d32f5" exitCode=0 Mar 08 21:42:03 crc kubenswrapper[4885]: I0308 21:42:03.422190 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" event={"ID":"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2","Type":"ContainerDied","Data":"353dc00bbe28654011d7f403f11d20bd383c705d36f6523ce418dd57c34d32f5"} Mar 08 21:42:04 crc kubenswrapper[4885]: I0308 21:42:04.888794 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.033712 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n6mf\" (UniqueName: \"kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf\") pod \"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2\" (UID: \"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2\") " Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.041165 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf" (OuterVolumeSpecName: "kube-api-access-9n6mf") pod "ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" (UID: "ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2"). InnerVolumeSpecName "kube-api-access-9n6mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.137258 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n6mf\" (UniqueName: \"kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf\") on node \"crc\" DevicePath \"\"" Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.449586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" event={"ID":"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2","Type":"ContainerDied","Data":"bb21f52acf1e88f6ef71055d97c4e3a5f049f05fa8b3d22063698f1482b4cc18"} Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.449911 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb21f52acf1e88f6ef71055d97c4e3a5f049f05fa8b3d22063698f1482b4cc18" Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.449680 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.987999 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550096-4s5wc"] Mar 08 21:42:06 crc kubenswrapper[4885]: I0308 21:42:06.001713 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550096-4s5wc"] Mar 08 21:42:07 crc kubenswrapper[4885]: I0308 21:42:07.384444 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44e60165-e38f-4fbe-87a1-5908598e0e38" path="/var/lib/kubelet/pods/44e60165-e38f-4fbe-87a1-5908598e0e38/volumes" Mar 08 21:42:12 crc kubenswrapper[4885]: I0308 21:42:12.987684 4885 scope.go:117] "RemoveContainer" containerID="59a022bee7d69812163e1296ed21c2217e23eb0a10b094d9fb3faabfbcba446f" Mar 08 21:42:15 crc kubenswrapper[4885]: I0308 21:42:15.368522 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:42:16 crc kubenswrapper[4885]: I0308 21:42:16.596483 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34"} Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.281992 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:33 crc kubenswrapper[4885]: E0308 21:42:33.283081 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" containerName="oc" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.283097 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" containerName="oc" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.283353 4885 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" containerName="oc" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.290852 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.327123 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.412096 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jl6v\" (UniqueName: \"kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.412161 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.412189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.514349 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content\") pod 
\"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.514400 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.514583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jl6v\" (UniqueName: \"kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.514997 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.515140 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.551202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jl6v\" (UniqueName: \"kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v\") pod \"certified-operators-js2l5\" (UID: 
\"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.622441 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:34 crc kubenswrapper[4885]: I0308 21:42:34.155051 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:34 crc kubenswrapper[4885]: I0308 21:42:34.834486 4885 generic.go:334] "Generic (PLEG): container finished" podID="72aa9440-1700-4412-a459-8a62c4ee863b" containerID="11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab" exitCode=0 Mar 08 21:42:34 crc kubenswrapper[4885]: I0308 21:42:34.834868 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerDied","Data":"11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab"} Mar 08 21:42:34 crc kubenswrapper[4885]: I0308 21:42:34.834905 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerStarted","Data":"5ffb02efe981c8b3a4fd909c1a9ac7dac5ae00bcc0dc55e21f8153d606172eb7"} Mar 08 21:42:36 crc kubenswrapper[4885]: I0308 21:42:36.868866 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerStarted","Data":"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079"} Mar 08 21:42:37 crc kubenswrapper[4885]: I0308 21:42:37.884117 4885 generic.go:334] "Generic (PLEG): container finished" podID="72aa9440-1700-4412-a459-8a62c4ee863b" containerID="02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079" exitCode=0 Mar 08 21:42:37 crc kubenswrapper[4885]: I0308 
21:42:37.884211 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerDied","Data":"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079"} Mar 08 21:42:38 crc kubenswrapper[4885]: I0308 21:42:38.901997 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerStarted","Data":"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e"} Mar 08 21:42:38 crc kubenswrapper[4885]: I0308 21:42:38.933724 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-js2l5" podStartSLOduration=2.362670671 podStartE2EDuration="5.933710615s" podCreationTimestamp="2026-03-08 21:42:33 +0000 UTC" firstStartedPulling="2026-03-08 21:42:34.839144426 +0000 UTC m=+7856.235198489" lastFinishedPulling="2026-03-08 21:42:38.41018441 +0000 UTC m=+7859.806238433" observedRunningTime="2026-03-08 21:42:38.930601722 +0000 UTC m=+7860.326655775" watchObservedRunningTime="2026-03-08 21:42:38.933710615 +0000 UTC m=+7860.329764638" Mar 08 21:42:43 crc kubenswrapper[4885]: I0308 21:42:43.623124 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:43 crc kubenswrapper[4885]: I0308 21:42:43.623907 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:43 crc kubenswrapper[4885]: I0308 21:42:43.721151 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:44 crc kubenswrapper[4885]: I0308 21:42:44.072750 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 
21:42:44 crc kubenswrapper[4885]: I0308 21:42:44.155626 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.046291 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-js2l5" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="registry-server" containerID="cri-o://4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e" gracePeriod=2 Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.604467 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.787186 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jl6v\" (UniqueName: \"kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v\") pod \"72aa9440-1700-4412-a459-8a62c4ee863b\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.787516 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content\") pod \"72aa9440-1700-4412-a459-8a62c4ee863b\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.787612 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities\") pod \"72aa9440-1700-4412-a459-8a62c4ee863b\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.788741 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities" (OuterVolumeSpecName: "utilities") pod "72aa9440-1700-4412-a459-8a62c4ee863b" (UID: "72aa9440-1700-4412-a459-8a62c4ee863b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.799718 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v" (OuterVolumeSpecName: "kube-api-access-8jl6v") pod "72aa9440-1700-4412-a459-8a62c4ee863b" (UID: "72aa9440-1700-4412-a459-8a62c4ee863b"). InnerVolumeSpecName "kube-api-access-8jl6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.858229 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72aa9440-1700-4412-a459-8a62c4ee863b" (UID: "72aa9440-1700-4412-a459-8a62c4ee863b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.890237 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jl6v\" (UniqueName: \"kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v\") on node \"crc\" DevicePath \"\"" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.890277 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.890286 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.059391 4885 generic.go:334] "Generic (PLEG): container finished" podID="72aa9440-1700-4412-a459-8a62c4ee863b" containerID="4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e" exitCode=0 Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.059446 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerDied","Data":"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e"} Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.059462 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.059482 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerDied","Data":"5ffb02efe981c8b3a4fd909c1a9ac7dac5ae00bcc0dc55e21f8153d606172eb7"} Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.059500 4885 scope.go:117] "RemoveContainer" containerID="4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.085881 4885 scope.go:117] "RemoveContainer" containerID="02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.105014 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.118372 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.130546 4885 scope.go:117] "RemoveContainer" containerID="11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.169566 4885 scope.go:117] "RemoveContainer" containerID="4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e" Mar 08 21:42:47 crc kubenswrapper[4885]: E0308 21:42:47.170245 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e\": container with ID starting with 4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e not found: ID does not exist" containerID="4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.170303 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e"} err="failed to get container status \"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e\": rpc error: code = NotFound desc = could not find container \"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e\": container with ID starting with 4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e not found: ID does not exist" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.170336 4885 scope.go:117] "RemoveContainer" containerID="02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079" Mar 08 21:42:47 crc kubenswrapper[4885]: E0308 21:42:47.170805 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079\": container with ID starting with 02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079 not found: ID does not exist" containerID="02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.170841 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079"} err="failed to get container status \"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079\": rpc error: code = NotFound desc = could not find container \"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079\": container with ID starting with 02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079 not found: ID does not exist" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.170865 4885 scope.go:117] "RemoveContainer" containerID="11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab" Mar 08 21:42:47 crc kubenswrapper[4885]: E0308 
21:42:47.171156 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab\": container with ID starting with 11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab not found: ID does not exist" containerID="11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.171187 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab"} err="failed to get container status \"11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab\": rpc error: code = NotFound desc = could not find container \"11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab\": container with ID starting with 11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab not found: ID does not exist" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.382640 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" path="/var/lib/kubelet/pods/72aa9440-1700-4412-a459-8a62c4ee863b/volumes" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.866423 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:02 crc kubenswrapper[4885]: E0308 21:43:02.867528 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="extract-utilities" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.867546 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="extract-utilities" Mar 08 21:43:02 crc kubenswrapper[4885]: E0308 21:43:02.867567 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="registry-server" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.867575 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="registry-server" Mar 08 21:43:02 crc kubenswrapper[4885]: E0308 21:43:02.867590 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="extract-content" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.867599 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="extract-content" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.867899 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="registry-server" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.869853 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.894050 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.061274 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.061607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlwtq\" (UniqueName: \"kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq\") pod \"redhat-marketplace-fs7zb\" (UID: 
\"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.061647 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.163975 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.164045 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlwtq\" (UniqueName: \"kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.164096 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.164498 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities\") pod \"redhat-marketplace-fs7zb\" (UID: 
\"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.164713 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.185180 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlwtq\" (UniqueName: \"kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.202531 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.690691 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:04 crc kubenswrapper[4885]: I0308 21:43:04.295867 4885 generic.go:334] "Generic (PLEG): container finished" podID="f64d2024-4751-4d9e-8565-bd06234f5388" containerID="f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90" exitCode=0 Mar 08 21:43:04 crc kubenswrapper[4885]: I0308 21:43:04.296176 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerDied","Data":"f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90"} Mar 08 21:43:04 crc kubenswrapper[4885]: I0308 21:43:04.296207 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" 
event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerStarted","Data":"0b7182a340904e31bab48ebea5ffe43ac333be6ef60964d49770c8a77207a9b7"} Mar 08 21:43:05 crc kubenswrapper[4885]: I0308 21:43:05.312990 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerStarted","Data":"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec"} Mar 08 21:43:06 crc kubenswrapper[4885]: I0308 21:43:06.325711 4885 generic.go:334] "Generic (PLEG): container finished" podID="f64d2024-4751-4d9e-8565-bd06234f5388" containerID="4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec" exitCode=0 Mar 08 21:43:06 crc kubenswrapper[4885]: I0308 21:43:06.325761 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerDied","Data":"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec"} Mar 08 21:43:07 crc kubenswrapper[4885]: I0308 21:43:07.351555 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerStarted","Data":"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e"} Mar 08 21:43:07 crc kubenswrapper[4885]: I0308 21:43:07.380540 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fs7zb" podStartSLOduration=2.9091557200000002 podStartE2EDuration="5.380521391s" podCreationTimestamp="2026-03-08 21:43:02 +0000 UTC" firstStartedPulling="2026-03-08 21:43:04.298425579 +0000 UTC m=+7885.694479622" lastFinishedPulling="2026-03-08 21:43:06.76979124 +0000 UTC m=+7888.165845293" observedRunningTime="2026-03-08 21:43:07.373802321 +0000 UTC m=+7888.769856344" watchObservedRunningTime="2026-03-08 21:43:07.380521391 +0000 UTC 
m=+7888.776575414" Mar 08 21:43:13 crc kubenswrapper[4885]: I0308 21:43:13.203174 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:13 crc kubenswrapper[4885]: I0308 21:43:13.203773 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:13 crc kubenswrapper[4885]: I0308 21:43:13.280500 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:13 crc kubenswrapper[4885]: I0308 21:43:13.505899 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:13 crc kubenswrapper[4885]: I0308 21:43:13.585104 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:15 crc kubenswrapper[4885]: I0308 21:43:15.451817 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fs7zb" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="registry-server" containerID="cri-o://70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e" gracePeriod=2 Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.162979 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.302137 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlwtq\" (UniqueName: \"kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq\") pod \"f64d2024-4751-4d9e-8565-bd06234f5388\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.302373 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities\") pod \"f64d2024-4751-4d9e-8565-bd06234f5388\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.302622 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content\") pod \"f64d2024-4751-4d9e-8565-bd06234f5388\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.303695 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities" (OuterVolumeSpecName: "utilities") pod "f64d2024-4751-4d9e-8565-bd06234f5388" (UID: "f64d2024-4751-4d9e-8565-bd06234f5388"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.311276 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq" (OuterVolumeSpecName: "kube-api-access-tlwtq") pod "f64d2024-4751-4d9e-8565-bd06234f5388" (UID: "f64d2024-4751-4d9e-8565-bd06234f5388"). InnerVolumeSpecName "kube-api-access-tlwtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.312479 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlwtq\" (UniqueName: \"kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq\") on node \"crc\" DevicePath \"\"" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.312560 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.340295 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f64d2024-4751-4d9e-8565-bd06234f5388" (UID: "f64d2024-4751-4d9e-8565-bd06234f5388"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.413937 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.464748 4885 generic.go:334] "Generic (PLEG): container finished" podID="f64d2024-4751-4d9e-8565-bd06234f5388" containerID="70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e" exitCode=0 Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.464799 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerDied","Data":"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e"} Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.464828 4885 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerDied","Data":"0b7182a340904e31bab48ebea5ffe43ac333be6ef60964d49770c8a77207a9b7"} Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.464847 4885 scope.go:117] "RemoveContainer" containerID="70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.465080 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.509127 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.511700 4885 scope.go:117] "RemoveContainer" containerID="4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.519746 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.540209 4885 scope.go:117] "RemoveContainer" containerID="f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.601434 4885 scope.go:117] "RemoveContainer" containerID="70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e" Mar 08 21:43:16 crc kubenswrapper[4885]: E0308 21:43:16.602244 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e\": container with ID starting with 70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e not found: ID does not exist" containerID="70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.602370 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e"} err="failed to get container status \"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e\": rpc error: code = NotFound desc = could not find container \"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e\": container with ID starting with 70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e not found: ID does not exist" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.602487 4885 scope.go:117] "RemoveContainer" containerID="4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec" Mar 08 21:43:16 crc kubenswrapper[4885]: E0308 21:43:16.603195 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec\": container with ID starting with 4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec not found: ID does not exist" containerID="4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.603272 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec"} err="failed to get container status \"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec\": rpc error: code = NotFound desc = could not find container \"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec\": container with ID starting with 4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec not found: ID does not exist" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.603322 4885 scope.go:117] "RemoveContainer" containerID="f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90" Mar 08 21:43:16 crc kubenswrapper[4885]: E0308 
21:43:16.603852 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90\": container with ID starting with f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90 not found: ID does not exist" containerID="f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.603890 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90"} err="failed to get container status \"f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90\": rpc error: code = NotFound desc = could not find container \"f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90\": container with ID starting with f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90 not found: ID does not exist" Mar 08 21:43:17 crc kubenswrapper[4885]: I0308 21:43:17.379221 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" path="/var/lib/kubelet/pods/f64d2024-4751-4d9e-8565-bd06234f5388/volumes" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.158905 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550104-7bjtf"] Mar 08 21:44:00 crc kubenswrapper[4885]: E0308 21:44:00.160247 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="registry-server" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.160273 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="registry-server" Mar 08 21:44:00 crc kubenswrapper[4885]: E0308 21:44:00.160339 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="extract-utilities" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.160353 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="extract-utilities" Mar 08 21:44:00 crc kubenswrapper[4885]: E0308 21:44:00.160395 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="extract-content" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.160409 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="extract-content" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.160840 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="registry-server" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.162296 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.166692 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.166946 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.173504 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.176495 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550104-7bjtf"] Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.295411 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5r9l\" (UniqueName: 
\"kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l\") pod \"auto-csr-approver-29550104-7bjtf\" (UID: \"9887f7af-0796-4481-aa3d-5f4996f9ed47\") " pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.397722 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5r9l\" (UniqueName: \"kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l\") pod \"auto-csr-approver-29550104-7bjtf\" (UID: \"9887f7af-0796-4481-aa3d-5f4996f9ed47\") " pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.417244 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5r9l\" (UniqueName: \"kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l\") pod \"auto-csr-approver-29550104-7bjtf\" (UID: \"9887f7af-0796-4481-aa3d-5f4996f9ed47\") " pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.485580 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:01 crc kubenswrapper[4885]: I0308 21:44:01.000014 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550104-7bjtf"] Mar 08 21:44:01 crc kubenswrapper[4885]: I0308 21:44:01.067232 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" event={"ID":"9887f7af-0796-4481-aa3d-5f4996f9ed47","Type":"ContainerStarted","Data":"ee0809ff828598e522807f85caad1a38502c44983ae469e801fb1077a072ad13"} Mar 08 21:44:03 crc kubenswrapper[4885]: I0308 21:44:03.087727 4885 generic.go:334] "Generic (PLEG): container finished" podID="9887f7af-0796-4481-aa3d-5f4996f9ed47" containerID="dc8414267440eda43954aa07f3b3a3139275d86a35c8b1f3e17198abfe4f8b5c" exitCode=0 Mar 08 21:44:03 crc kubenswrapper[4885]: I0308 21:44:03.087856 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" event={"ID":"9887f7af-0796-4481-aa3d-5f4996f9ed47","Type":"ContainerDied","Data":"dc8414267440eda43954aa07f3b3a3139275d86a35c8b1f3e17198abfe4f8b5c"} Mar 08 21:44:04 crc kubenswrapper[4885]: I0308 21:44:04.573971 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:04 crc kubenswrapper[4885]: I0308 21:44:04.587265 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5r9l\" (UniqueName: \"kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l\") pod \"9887f7af-0796-4481-aa3d-5f4996f9ed47\" (UID: \"9887f7af-0796-4481-aa3d-5f4996f9ed47\") " Mar 08 21:44:04 crc kubenswrapper[4885]: I0308 21:44:04.603343 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l" (OuterVolumeSpecName: "kube-api-access-s5r9l") pod "9887f7af-0796-4481-aa3d-5f4996f9ed47" (UID: "9887f7af-0796-4481-aa3d-5f4996f9ed47"). InnerVolumeSpecName "kube-api-access-s5r9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:44:04 crc kubenswrapper[4885]: I0308 21:44:04.692123 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5r9l\" (UniqueName: \"kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l\") on node \"crc\" DevicePath \"\"" Mar 08 21:44:05 crc kubenswrapper[4885]: I0308 21:44:05.115045 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" event={"ID":"9887f7af-0796-4481-aa3d-5f4996f9ed47","Type":"ContainerDied","Data":"ee0809ff828598e522807f85caad1a38502c44983ae469e801fb1077a072ad13"} Mar 08 21:44:05 crc kubenswrapper[4885]: I0308 21:44:05.115121 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee0809ff828598e522807f85caad1a38502c44983ae469e801fb1077a072ad13" Mar 08 21:44:05 crc kubenswrapper[4885]: I0308 21:44:05.115199 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:05 crc kubenswrapper[4885]: I0308 21:44:05.685160 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550098-47g92"] Mar 08 21:44:05 crc kubenswrapper[4885]: I0308 21:44:05.696028 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550098-47g92"] Mar 08 21:44:07 crc kubenswrapper[4885]: I0308 21:44:07.413381 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c51bcd-c065-4fa7-8318-0d0704836166" path="/var/lib/kubelet/pods/67c51bcd-c065-4fa7-8318-0d0704836166/volumes" Mar 08 21:44:13 crc kubenswrapper[4885]: I0308 21:44:13.149342 4885 scope.go:117] "RemoveContainer" containerID="2fc2dee49966150d464450c5304d2011d968fb7949c03e2bf89d92f9c82630c7" Mar 08 21:44:32 crc kubenswrapper[4885]: I0308 21:44:32.817968 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:44:32 crc kubenswrapper[4885]: I0308 21:44:32.819045 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.164337 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6"] Mar 08 21:45:00 crc kubenswrapper[4885]: E0308 21:45:00.165436 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9887f7af-0796-4481-aa3d-5f4996f9ed47" containerName="oc" Mar 08 
21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.165453 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9887f7af-0796-4481-aa3d-5f4996f9ed47" containerName="oc" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.165707 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9887f7af-0796-4481-aa3d-5f4996f9ed47" containerName="oc" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.166598 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.170601 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.170857 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.176065 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6"] Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.250751 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwp26\" (UniqueName: \"kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.251264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.251377 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.353598 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwp26\" (UniqueName: \"kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.353648 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.353705 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.354696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.360796 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.377202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwp26\" (UniqueName: \"kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.496381 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:01 crc kubenswrapper[4885]: I0308 21:45:01.006899 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6"] Mar 08 21:45:01 crc kubenswrapper[4885]: W0308 21:45:01.011226 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e47a3ad_b255_4044_b68b_42a99706339d.slice/crio-362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14 WatchSource:0}: Error finding container 362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14: Status 404 returned error can't find the container with id 362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14 Mar 08 21:45:01 crc kubenswrapper[4885]: I0308 21:45:01.877724 4885 generic.go:334] "Generic (PLEG): container finished" podID="1e47a3ad-b255-4044-b68b-42a99706339d" containerID="7169885beaf1c90ab8d33a9c81aa17dfc1d7ecd0abd67f01755ae2b597cfaa85" exitCode=0 Mar 08 21:45:01 crc kubenswrapper[4885]: I0308 21:45:01.877950 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" event={"ID":"1e47a3ad-b255-4044-b68b-42a99706339d","Type":"ContainerDied","Data":"7169885beaf1c90ab8d33a9c81aa17dfc1d7ecd0abd67f01755ae2b597cfaa85"} Mar 08 21:45:01 crc kubenswrapper[4885]: I0308 21:45:01.878162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" event={"ID":"1e47a3ad-b255-4044-b68b-42a99706339d","Type":"ContainerStarted","Data":"362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14"} Mar 08 21:45:02 crc kubenswrapper[4885]: I0308 21:45:02.818589 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:45:02 crc kubenswrapper[4885]: I0308 21:45:02.818887 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.345422 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.521451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume\") pod \"1e47a3ad-b255-4044-b68b-42a99706339d\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.521541 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwp26\" (UniqueName: \"kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26\") pod \"1e47a3ad-b255-4044-b68b-42a99706339d\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.521580 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume\") pod \"1e47a3ad-b255-4044-b68b-42a99706339d\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.522709 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume" (OuterVolumeSpecName: "config-volume") pod "1e47a3ad-b255-4044-b68b-42a99706339d" (UID: "1e47a3ad-b255-4044-b68b-42a99706339d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.529638 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26" (OuterVolumeSpecName: "kube-api-access-gwp26") pod "1e47a3ad-b255-4044-b68b-42a99706339d" (UID: "1e47a3ad-b255-4044-b68b-42a99706339d"). InnerVolumeSpecName "kube-api-access-gwp26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.531598 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1e47a3ad-b255-4044-b68b-42a99706339d" (UID: "1e47a3ad-b255-4044-b68b-42a99706339d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.625516 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.625574 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwp26\" (UniqueName: \"kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26\") on node \"crc\" DevicePath \"\"" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.625595 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.906602 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" event={"ID":"1e47a3ad-b255-4044-b68b-42a99706339d","Type":"ContainerDied","Data":"362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14"} Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.907079 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.906705 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:04 crc kubenswrapper[4885]: I0308 21:45:04.462782 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"] Mar 08 21:45:04 crc kubenswrapper[4885]: I0308 21:45:04.475857 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"] Mar 08 21:45:05 crc kubenswrapper[4885]: I0308 21:45:05.390340 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c3bcc1-5dd6-411d-8030-a152617aa0a3" path="/var/lib/kubelet/pods/62c3bcc1-5dd6-411d-8030-a152617aa0a3/volumes" Mar 08 21:45:13 crc kubenswrapper[4885]: I0308 21:45:13.284161 4885 scope.go:117] "RemoveContainer" containerID="e07b444034fa8d1cd5c5dd9ad29413db942d04d6d0dccd4d4f03d228986183ea" Mar 08 21:45:32 crc kubenswrapper[4885]: I0308 21:45:32.818733 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:45:32 crc kubenswrapper[4885]: I0308 21:45:32.819334 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:45:32 crc kubenswrapper[4885]: I0308 21:45:32.819407 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:45:32 crc kubenswrapper[4885]: I0308 21:45:32.820653 4885 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:45:32 crc kubenswrapper[4885]: I0308 21:45:32.820750 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34" gracePeriod=600 Mar 08 21:45:33 crc kubenswrapper[4885]: I0308 21:45:33.259492 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34" exitCode=0 Mar 08 21:45:33 crc kubenswrapper[4885]: I0308 21:45:33.259575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34"} Mar 08 21:45:33 crc kubenswrapper[4885]: I0308 21:45:33.260206 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732"} Mar 08 21:45:33 crc kubenswrapper[4885]: I0308 21:45:33.260345 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.198233 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550106-t45l9"] Mar 08 21:46:00 crc 
kubenswrapper[4885]: E0308 21:46:00.199684 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e47a3ad-b255-4044-b68b-42a99706339d" containerName="collect-profiles" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.199708 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e47a3ad-b255-4044-b68b-42a99706339d" containerName="collect-profiles" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.200174 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e47a3ad-b255-4044-b68b-42a99706339d" containerName="collect-profiles" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.201532 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.204519 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.204599 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.204715 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.215984 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550106-t45l9"] Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.278177 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqs4m\" (UniqueName: \"kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m\") pod \"auto-csr-approver-29550106-t45l9\" (UID: \"7e0c5067-1184-4a75-a80b-35b1d03f2a47\") " pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.380089 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqs4m\" (UniqueName: \"kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m\") pod \"auto-csr-approver-29550106-t45l9\" (UID: \"7e0c5067-1184-4a75-a80b-35b1d03f2a47\") " pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.399667 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqs4m\" (UniqueName: \"kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m\") pod \"auto-csr-approver-29550106-t45l9\" (UID: \"7e0c5067-1184-4a75-a80b-35b1d03f2a47\") " pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.540167 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:01 crc kubenswrapper[4885]: I0308 21:46:01.086134 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550106-t45l9"] Mar 08 21:46:01 crc kubenswrapper[4885]: W0308 21:46:01.094138 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e0c5067_1184_4a75_a80b_35b1d03f2a47.slice/crio-94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a WatchSource:0}: Error finding container 94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a: Status 404 returned error can't find the container with id 94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a Mar 08 21:46:01 crc kubenswrapper[4885]: I0308 21:46:01.631177 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550106-t45l9" event={"ID":"7e0c5067-1184-4a75-a80b-35b1d03f2a47","Type":"ContainerStarted","Data":"94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a"} Mar 08 
21:46:03 crc kubenswrapper[4885]: I0308 21:46:03.658125 4885 generic.go:334] "Generic (PLEG): container finished" podID="7e0c5067-1184-4a75-a80b-35b1d03f2a47" containerID="ac6daccc4f232068706f6a472830b30704b48c1d2e179f38f98cf3dcf6cbd7b0" exitCode=0 Mar 08 21:46:03 crc kubenswrapper[4885]: I0308 21:46:03.658207 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550106-t45l9" event={"ID":"7e0c5067-1184-4a75-a80b-35b1d03f2a47","Type":"ContainerDied","Data":"ac6daccc4f232068706f6a472830b30704b48c1d2e179f38f98cf3dcf6cbd7b0"} Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.176286 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.336361 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqs4m\" (UniqueName: \"kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m\") pod \"7e0c5067-1184-4a75-a80b-35b1d03f2a47\" (UID: \"7e0c5067-1184-4a75-a80b-35b1d03f2a47\") " Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.342337 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m" (OuterVolumeSpecName: "kube-api-access-gqs4m") pod "7e0c5067-1184-4a75-a80b-35b1d03f2a47" (UID: "7e0c5067-1184-4a75-a80b-35b1d03f2a47"). InnerVolumeSpecName "kube-api-access-gqs4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.439515 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqs4m\" (UniqueName: \"kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.692670 4885 generic.go:334] "Generic (PLEG): container finished" podID="992d3500-f892-42c6-805f-ae9c96793d0f" containerID="135b71ffe2c5954d98583a529a1ec42b83ce4fb4d31e613f230e8d7339fd376b" exitCode=0 Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.692809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" event={"ID":"992d3500-f892-42c6-805f-ae9c96793d0f","Type":"ContainerDied","Data":"135b71ffe2c5954d98583a529a1ec42b83ce4fb4d31e613f230e8d7339fd376b"} Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.697132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550106-t45l9" event={"ID":"7e0c5067-1184-4a75-a80b-35b1d03f2a47","Type":"ContainerDied","Data":"94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a"} Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.697182 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a" Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.697190 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:06 crc kubenswrapper[4885]: I0308 21:46:06.280915 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550100-8r628"] Mar 08 21:46:06 crc kubenswrapper[4885]: I0308 21:46:06.296324 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550100-8r628"] Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.231208 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382411 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle\") pod \"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382492 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dz7c\" (UniqueName: \"kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c\") pod \"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382594 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph\") pod \"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382665 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0\") pod 
\"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382688 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1\") pod \"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382727 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory\") pod \"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382764 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f4584f-eeba-4b88-b31d-79a39f062bd3" path="/var/lib/kubelet/pods/70f4584f-eeba-4b88-b31d-79a39f062bd3/volumes" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.389413 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph" (OuterVolumeSpecName: "ceph") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.389836 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c" (OuterVolumeSpecName: "kube-api-access-2dz7c") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "kube-api-access-2dz7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.401239 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.427525 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.431998 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.438031 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory" (OuterVolumeSpecName: "inventory") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486552 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486597 4885 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486615 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486630 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486643 4885 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486656 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dz7c\" (UniqueName: \"kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.724620 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.724604 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" event={"ID":"992d3500-f892-42c6-805f-ae9c96793d0f","Type":"ContainerDied","Data":"2ee7a4abb905c5d1b376a7b80993ad31e44b7c26d66ceb2dc9a19d787251b55b"} Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.724889 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ee7a4abb905c5d1b376a7b80993ad31e44b7c26d66ceb2dc9a19d787251b55b" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.832019 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-thkw7"] Mar 08 21:46:07 crc kubenswrapper[4885]: E0308 21:46:07.832479 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0c5067-1184-4a75-a80b-35b1d03f2a47" containerName="oc" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.832500 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0c5067-1184-4a75-a80b-35b1d03f2a47" containerName="oc" Mar 08 21:46:07 crc kubenswrapper[4885]: E0308 21:46:07.832519 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992d3500-f892-42c6-805f-ae9c96793d0f" containerName="libvirt-openstack-openstack-cell1" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.832526 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="992d3500-f892-42c6-805f-ae9c96793d0f" containerName="libvirt-openstack-openstack-cell1" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.832729 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0c5067-1184-4a75-a80b-35b1d03f2a47" containerName="oc" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.832765 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="992d3500-f892-42c6-805f-ae9c96793d0f" 
containerName="libvirt-openstack-openstack-cell1" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.833519 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.835514 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.835639 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.837095 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.837140 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.837388 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.837724 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.841838 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.851782 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-thkw7"] Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.894808 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: 
\"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.894850 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8bgm\" (UniqueName: \"kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895058 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895141 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895187 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895209 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895230 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895313 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895358 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895445 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895482 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895506 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000109 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000206 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000249 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000279 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000350 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000404 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000441 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000487 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000536 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000573 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000613 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000638 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8bgm\" (UniqueName: \"kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000681 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.003672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.003739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.006743 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.007311 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.007458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.010311 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.010403 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.011301 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.011461 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.014685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.022898 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.023359 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.026163 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8bgm\" (UniqueName: \"kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.158855 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.736138 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-thkw7"]
Mar 08 21:46:08 crc kubenswrapper[4885]: W0308 21:46:08.741061 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaecb4202_1208_4ba5_8515_2ecf99c8c7d1.slice/crio-fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797 WatchSource:0}: Error finding container fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797: Status 404 returned error can't find the container with id fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797
Mar 08 21:46:09 crc kubenswrapper[4885]: I0308 21:46:09.749540 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" event={"ID":"aecb4202-1208-4ba5-8515-2ecf99c8c7d1","Type":"ContainerStarted","Data":"fcb12117529fd7a6a205069e64cae7ffc810c030e4a281bfe35c2567f1161039"}
Mar 08 21:46:09 crc kubenswrapper[4885]: I0308 21:46:09.749840 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" event={"ID":"aecb4202-1208-4ba5-8515-2ecf99c8c7d1","Type":"ContainerStarted","Data":"fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797"}
Mar 08 21:46:09 crc kubenswrapper[4885]: I0308 21:46:09.785503 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" podStartSLOduration=2.208969537 podStartE2EDuration="2.785482945s" podCreationTimestamp="2026-03-08 21:46:07 +0000 UTC" firstStartedPulling="2026-03-08 21:46:08.744682173 +0000 UTC m=+8070.140736206" lastFinishedPulling="2026-03-08 21:46:09.321195581 +0000 UTC m=+8070.717249614" observedRunningTime="2026-03-08 21:46:09.776465755 +0000 UTC m=+8071.172519788" watchObservedRunningTime="2026-03-08 21:46:09.785482945 +0000 UTC m=+8071.181536988"
Mar 08 21:46:13 crc kubenswrapper[4885]: I0308 21:46:13.384713 4885 scope.go:117] "RemoveContainer" containerID="51c3dd5d7c040092d94f90c0958c32d896513ce729f2d08313181e0c6cc74c41"
Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.148463 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550108-z4l8k"]
Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.150141 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550108-z4l8k"
Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.152345 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.152863 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.152892 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.174710 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550108-z4l8k"]
Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.315781 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxttp\" (UniqueName: \"kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp\") pod \"auto-csr-approver-29550108-z4l8k\" (UID: \"ea33e917-a39b-4c83-b80a-9562ddbc2459\") " pod="openshift-infra/auto-csr-approver-29550108-z4l8k"
Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.418135 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxttp\" (UniqueName: \"kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp\") pod \"auto-csr-approver-29550108-z4l8k\" (UID: \"ea33e917-a39b-4c83-b80a-9562ddbc2459\") " pod="openshift-infra/auto-csr-approver-29550108-z4l8k"
Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.449224 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxttp\" (UniqueName: \"kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp\") pod \"auto-csr-approver-29550108-z4l8k\" (UID: \"ea33e917-a39b-4c83-b80a-9562ddbc2459\") " pod="openshift-infra/auto-csr-approver-29550108-z4l8k"
Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.486577 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550108-z4l8k"
Mar 08 21:48:01 crc kubenswrapper[4885]: I0308 21:48:01.057488 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 21:48:01 crc kubenswrapper[4885]: I0308 21:48:01.059050 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550108-z4l8k"]
Mar 08 21:48:01 crc kubenswrapper[4885]: I0308 21:48:01.240891 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550108-z4l8k" event={"ID":"ea33e917-a39b-4c83-b80a-9562ddbc2459","Type":"ContainerStarted","Data":"2c8d71d1878de46bf1f1c9fc6533a206e3060af4e112eb8cee4d9782d74090f1"}
Mar 08 21:48:02 crc kubenswrapper[4885]: I0308 21:48:02.819172 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 21:48:02 crc kubenswrapper[4885]: I0308 21:48:02.819926 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 21:48:03 crc kubenswrapper[4885]: I0308 21:48:03.265434 4885 generic.go:334] "Generic (PLEG): container finished" podID="ea33e917-a39b-4c83-b80a-9562ddbc2459" containerID="d4e92452936a7719768a56376a13fed75b8884b1fc954bbcc0c2fcc22f2c6332" exitCode=0
Mar 08 21:48:03 crc kubenswrapper[4885]: I0308 21:48:03.265753 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550108-z4l8k" event={"ID":"ea33e917-a39b-4c83-b80a-9562ddbc2459","Type":"ContainerDied","Data":"d4e92452936a7719768a56376a13fed75b8884b1fc954bbcc0c2fcc22f2c6332"}
Mar 08 21:48:04 crc kubenswrapper[4885]: I0308 21:48:04.700551 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550108-z4l8k"
Mar 08 21:48:04 crc kubenswrapper[4885]: I0308 21:48:04.827435 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxttp\" (UniqueName: \"kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp\") pod \"ea33e917-a39b-4c83-b80a-9562ddbc2459\" (UID: \"ea33e917-a39b-4c83-b80a-9562ddbc2459\") "
Mar 08 21:48:04 crc kubenswrapper[4885]: I0308 21:48:04.841378 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp" (OuterVolumeSpecName: "kube-api-access-hxttp") pod "ea33e917-a39b-4c83-b80a-9562ddbc2459" (UID: "ea33e917-a39b-4c83-b80a-9562ddbc2459"). InnerVolumeSpecName "kube-api-access-hxttp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:48:04 crc kubenswrapper[4885]: I0308 21:48:04.929927 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxttp\" (UniqueName: \"kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp\") on node \"crc\" DevicePath \"\""
Mar 08 21:48:05 crc kubenswrapper[4885]: I0308 21:48:05.297193 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550108-z4l8k" event={"ID":"ea33e917-a39b-4c83-b80a-9562ddbc2459","Type":"ContainerDied","Data":"2c8d71d1878de46bf1f1c9fc6533a206e3060af4e112eb8cee4d9782d74090f1"}
Mar 08 21:48:05 crc kubenswrapper[4885]: I0308 21:48:05.297984 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c8d71d1878de46bf1f1c9fc6533a206e3060af4e112eb8cee4d9782d74090f1"
Mar 08 21:48:05 crc kubenswrapper[4885]: I0308 21:48:05.297288 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550108-z4l8k"
Mar 08 21:48:05 crc kubenswrapper[4885]: I0308 21:48:05.785699 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550102-v4g2f"]
Mar 08 21:48:05 crc kubenswrapper[4885]: I0308 21:48:05.796370 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550102-v4g2f"]
Mar 08 21:48:07 crc kubenswrapper[4885]: I0308 21:48:07.397798 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" path="/var/lib/kubelet/pods/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2/volumes"
Mar 08 21:48:13 crc kubenswrapper[4885]: I0308 21:48:13.546502 4885 scope.go:117] "RemoveContainer" containerID="353dc00bbe28654011d7f403f11d20bd383c705d36f6523ce418dd57c34d32f5"
Mar 08 21:48:32 crc kubenswrapper[4885]: I0308 21:48:32.818145 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 21:48:32 crc kubenswrapper[4885]: I0308 21:48:32.818815 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 21:48:59 crc kubenswrapper[4885]: I0308 21:48:59.096090 4885 generic.go:334] "Generic (PLEG): container finished" podID="aecb4202-1208-4ba5-8515-2ecf99c8c7d1" containerID="fcb12117529fd7a6a205069e64cae7ffc810c030e4a281bfe35c2567f1161039" exitCode=0
Mar 08 21:48:59 crc kubenswrapper[4885]: I0308 21:48:59.096203 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" event={"ID":"aecb4202-1208-4ba5-8515-2ecf99c8c7d1","Type":"ContainerDied","Data":"fcb12117529fd7a6a205069e64cae7ffc810c030e4a281bfe35c2567f1161039"}
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.618797 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.743876 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8bgm\" (UniqueName: \"kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.743995 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744128 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744211 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744334 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744400 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744456 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744495 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744552 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744661 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744710 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744744 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744790 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") "
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.749937 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm" (OuterVolumeSpecName: "kube-api-access-j8bgm") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "kube-api-access-j8bgm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.750590 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph" (OuterVolumeSpecName: "ceph") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.763514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.774556 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory" (OuterVolumeSpecName: "inventory") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.777017 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.785804 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.788327 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.800805 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.806313 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.811699 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.812147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.814037 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.822636 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.846938 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.846968 4885 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.846980 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.846989 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847000 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8bgm\" (UniqueName: \"kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847007 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847015 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847024 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847032 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847040 4885 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847047 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847057 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847066 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.123520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" event={"ID":"aecb4202-1208-4ba5-8515-2ecf99c8c7d1","Type":"ContainerDied","Data":"fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797"}
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.123576 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.123851 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.245724 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9mchk"]
Mar 08 21:49:01 crc kubenswrapper[4885]: E0308 21:49:01.246267 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea33e917-a39b-4c83-b80a-9562ddbc2459" containerName="oc"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.246291 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea33e917-a39b-4c83-b80a-9562ddbc2459" containerName="oc"
Mar 08 21:49:01 crc kubenswrapper[4885]: E0308 21:49:01.246315 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecb4202-1208-4ba5-8515-2ecf99c8c7d1" containerName="nova-cell1-openstack-openstack-cell1"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.246325 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecb4202-1208-4ba5-8515-2ecf99c8c7d1" containerName="nova-cell1-openstack-openstack-cell1"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.246601 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aecb4202-1208-4ba5-8515-2ecf99c8c7d1" containerName="nova-cell1-openstack-openstack-cell1"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.246622 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea33e917-a39b-4c83-b80a-9562ddbc2459" containerName="oc"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.247602 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9mchk"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.250832 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.250855 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.251024 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.251207 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.251622 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.255198 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9mchk"]
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.362736 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk"
Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.362844 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1\") pod
\"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.362902 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.362961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.363317 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-774f5\" (UniqueName: \"kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.363445 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 
08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.363706 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.363823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.466512 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.466856 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.467169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: 
\"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.467480 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.467713 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.467894 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.468157 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-774f5\" (UniqueName: \"kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.468344 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.471418 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.471900 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.473854 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.474155 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 
21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.475156 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.476082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.476439 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.495659 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-774f5\" (UniqueName: \"kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.619410 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.181786 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9mchk"] Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.818752 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.819094 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.819141 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.819829 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.819886 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" 
containerID="cri-o://e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" gracePeriod=600 Mar 08 21:49:02 crc kubenswrapper[4885]: E0308 21:49:02.949388 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.146843 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" exitCode=0 Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.146973 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732"} Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.147050 4885 scope.go:117] "RemoveContainer" containerID="5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34" Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.147866 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:49:03 crc kubenswrapper[4885]: E0308 21:49:03.148315 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.148753 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" event={"ID":"5583daa6-0c35-4fde-8580-2a4d7ccbfb17","Type":"ContainerStarted","Data":"3deaad07a3fde91f11eb384ddd753833e0df6e58738141160f380ba138e7d97a"} Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.148800 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" event={"ID":"5583daa6-0c35-4fde-8580-2a4d7ccbfb17","Type":"ContainerStarted","Data":"7810ccd43709ca010d0df55ac127e2ace5d877bcf88c0cdb3c388a67357a2ff3"} Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.200556 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" podStartSLOduration=1.683922531 podStartE2EDuration="2.200538341s" podCreationTimestamp="2026-03-08 21:49:01 +0000 UTC" firstStartedPulling="2026-03-08 21:49:02.191365342 +0000 UTC m=+8243.587419405" lastFinishedPulling="2026-03-08 21:49:02.707981182 +0000 UTC m=+8244.104035215" observedRunningTime="2026-03-08 21:49:03.190069092 +0000 UTC m=+8244.586123155" watchObservedRunningTime="2026-03-08 21:49:03.200538341 +0000 UTC m=+8244.596592364" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.681655 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.684200 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.712198 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.803794 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pslq\" (UniqueName: \"kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.804035 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.804289 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.906308 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.906390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6pslq\" (UniqueName: \"kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.906416 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.907120 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.907169 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.930582 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pslq\" (UniqueName: \"kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:15 crc kubenswrapper[4885]: I0308 21:49:15.006103 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:15 crc kubenswrapper[4885]: I0308 21:49:15.527717 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:16 crc kubenswrapper[4885]: I0308 21:49:16.300379 4885 generic.go:334] "Generic (PLEG): container finished" podID="943c3a62-1a72-444a-b860-733dfdac5b16" containerID="28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3" exitCode=0 Mar 08 21:49:16 crc kubenswrapper[4885]: I0308 21:49:16.300492 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerDied","Data":"28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3"} Mar 08 21:49:16 crc kubenswrapper[4885]: I0308 21:49:16.301049 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerStarted","Data":"07317635e0f0429e716d66e0e8c7d0b42354a41fa7123aaa571931c9b0b000f0"} Mar 08 21:49:16 crc kubenswrapper[4885]: I0308 21:49:16.368045 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:49:16 crc kubenswrapper[4885]: E0308 21:49:16.368296 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:49:18 crc kubenswrapper[4885]: I0308 21:49:18.355602 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" 
event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerStarted","Data":"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c"} Mar 08 21:49:22 crc kubenswrapper[4885]: I0308 21:49:22.399593 4885 generic.go:334] "Generic (PLEG): container finished" podID="943c3a62-1a72-444a-b860-733dfdac5b16" containerID="d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c" exitCode=0 Mar 08 21:49:22 crc kubenswrapper[4885]: I0308 21:49:22.399675 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerDied","Data":"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c"} Mar 08 21:49:23 crc kubenswrapper[4885]: I0308 21:49:23.413505 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerStarted","Data":"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6"} Mar 08 21:49:23 crc kubenswrapper[4885]: I0308 21:49:23.446216 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-988dx" podStartSLOduration=2.8726832460000002 podStartE2EDuration="9.446194379s" podCreationTimestamp="2026-03-08 21:49:14 +0000 UTC" firstStartedPulling="2026-03-08 21:49:16.302899347 +0000 UTC m=+8257.698953370" lastFinishedPulling="2026-03-08 21:49:22.87641048 +0000 UTC m=+8264.272464503" observedRunningTime="2026-03-08 21:49:23.4342472 +0000 UTC m=+8264.830301243" watchObservedRunningTime="2026-03-08 21:49:23.446194379 +0000 UTC m=+8264.842248412" Mar 08 21:49:25 crc kubenswrapper[4885]: I0308 21:49:25.006307 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:25 crc kubenswrapper[4885]: I0308 21:49:25.006639 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:26 crc kubenswrapper[4885]: I0308 21:49:26.069829 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-988dx" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="registry-server" probeResult="failure" output=< Mar 08 21:49:26 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 21:49:26 crc kubenswrapper[4885]: > Mar 08 21:49:30 crc kubenswrapper[4885]: I0308 21:49:30.369385 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:49:30 crc kubenswrapper[4885]: E0308 21:49:30.370284 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:49:35 crc kubenswrapper[4885]: I0308 21:49:35.067677 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:35 crc kubenswrapper[4885]: I0308 21:49:35.136566 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:35 crc kubenswrapper[4885]: I0308 21:49:35.313464 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:36 crc kubenswrapper[4885]: I0308 21:49:36.583324 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-988dx" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="registry-server" 
containerID="cri-o://0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6" gracePeriod=2 Mar 08 21:49:36 crc kubenswrapper[4885]: E0308 21:49:36.731344 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943c3a62_1a72_444a_b860_733dfdac5b16.slice/crio-conmon-0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6.scope\": RecentStats: unable to find data in memory cache]" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.209171 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.240212 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pslq\" (UniqueName: \"kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq\") pod \"943c3a62-1a72-444a-b860-733dfdac5b16\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.240444 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities\") pod \"943c3a62-1a72-444a-b860-733dfdac5b16\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.240471 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content\") pod \"943c3a62-1a72-444a-b860-733dfdac5b16\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.241902 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities" 
(OuterVolumeSpecName: "utilities") pod "943c3a62-1a72-444a-b860-733dfdac5b16" (UID: "943c3a62-1a72-444a-b860-733dfdac5b16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.251903 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq" (OuterVolumeSpecName: "kube-api-access-6pslq") pod "943c3a62-1a72-444a-b860-733dfdac5b16" (UID: "943c3a62-1a72-444a-b860-733dfdac5b16"). InnerVolumeSpecName "kube-api-access-6pslq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.343463 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.343520 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pslq\" (UniqueName: \"kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.400783 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "943c3a62-1a72-444a-b860-733dfdac5b16" (UID: "943c3a62-1a72-444a-b860-733dfdac5b16"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.446256 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.598904 4885 generic.go:334] "Generic (PLEG): container finished" podID="943c3a62-1a72-444a-b860-733dfdac5b16" containerID="0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6" exitCode=0 Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.598991 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerDied","Data":"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6"} Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.599061 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerDied","Data":"07317635e0f0429e716d66e0e8c7d0b42354a41fa7123aaa571931c9b0b000f0"} Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.599090 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.599096 4885 scope.go:117] "RemoveContainer" containerID="0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.658952 4885 scope.go:117] "RemoveContainer" containerID="d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.669147 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.690994 4885 scope.go:117] "RemoveContainer" containerID="28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.694073 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.769270 4885 scope.go:117] "RemoveContainer" containerID="0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6" Mar 08 21:49:37 crc kubenswrapper[4885]: E0308 21:49:37.769782 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6\": container with ID starting with 0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6 not found: ID does not exist" containerID="0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.769830 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6"} err="failed to get container status \"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6\": rpc error: code = NotFound desc = could not find container 
\"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6\": container with ID starting with 0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6 not found: ID does not exist" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.769855 4885 scope.go:117] "RemoveContainer" containerID="d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c" Mar 08 21:49:37 crc kubenswrapper[4885]: E0308 21:49:37.770317 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c\": container with ID starting with d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c not found: ID does not exist" containerID="d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.770343 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c"} err="failed to get container status \"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c\": rpc error: code = NotFound desc = could not find container \"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c\": container with ID starting with d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c not found: ID does not exist" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.770364 4885 scope.go:117] "RemoveContainer" containerID="28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3" Mar 08 21:49:37 crc kubenswrapper[4885]: E0308 21:49:37.771139 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3\": container with ID starting with 28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3 not found: ID does not exist" 
containerID="28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.771164 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3"} err="failed to get container status \"28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3\": rpc error: code = NotFound desc = could not find container \"28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3\": container with ID starting with 28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3 not found: ID does not exist" Mar 08 21:49:39 crc kubenswrapper[4885]: I0308 21:49:39.386482 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" path="/var/lib/kubelet/pods/943c3a62-1a72-444a-b860-733dfdac5b16/volumes" Mar 08 21:49:44 crc kubenswrapper[4885]: I0308 21:49:44.368522 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:49:44 crc kubenswrapper[4885]: E0308 21:49:44.369312 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:49:55 crc kubenswrapper[4885]: I0308 21:49:55.368498 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:49:55 crc kubenswrapper[4885]: E0308 21:49:55.369582 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.150914 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550110-9ztjc"] Mar 08 21:50:00 crc kubenswrapper[4885]: E0308 21:50:00.151954 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="registry-server" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.151971 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="registry-server" Mar 08 21:50:00 crc kubenswrapper[4885]: E0308 21:50:00.151993 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="extract-content" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.152002 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="extract-content" Mar 08 21:50:00 crc kubenswrapper[4885]: E0308 21:50:00.152035 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="extract-utilities" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.152041 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="extract-utilities" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.152261 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="registry-server" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.153169 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.156163 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.156184 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.156248 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.167749 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550110-9ztjc"] Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.247868 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzsnf\" (UniqueName: \"kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf\") pod \"auto-csr-approver-29550110-9ztjc\" (UID: \"baf0e32a-60d4-4a44-af91-6bbe65bc82c9\") " pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.351508 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzsnf\" (UniqueName: \"kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf\") pod \"auto-csr-approver-29550110-9ztjc\" (UID: \"baf0e32a-60d4-4a44-af91-6bbe65bc82c9\") " pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.373634 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzsnf\" (UniqueName: \"kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf\") pod \"auto-csr-approver-29550110-9ztjc\" (UID: \"baf0e32a-60d4-4a44-af91-6bbe65bc82c9\") " 
pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.477499 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.998149 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550110-9ztjc"] Mar 08 21:50:01 crc kubenswrapper[4885]: I0308 21:50:01.881395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" event={"ID":"baf0e32a-60d4-4a44-af91-6bbe65bc82c9","Type":"ContainerStarted","Data":"924b95bbb6317b6a65c4d458f1fea1b8e7f0a14e5ae7b3ca71a36da6871b9601"} Mar 08 21:50:02 crc kubenswrapper[4885]: I0308 21:50:02.892419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" event={"ID":"baf0e32a-60d4-4a44-af91-6bbe65bc82c9","Type":"ContainerStarted","Data":"6672cbe9c59e44b713aad8bb2c9fe671a86cd6c40abe9e893295200a4e793993"} Mar 08 21:50:02 crc kubenswrapper[4885]: I0308 21:50:02.922077 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" podStartSLOduration=1.6907394519999999 podStartE2EDuration="2.922054456s" podCreationTimestamp="2026-03-08 21:50:00 +0000 UTC" firstStartedPulling="2026-03-08 21:50:01.002134534 +0000 UTC m=+8302.398188557" lastFinishedPulling="2026-03-08 21:50:02.233449528 +0000 UTC m=+8303.629503561" observedRunningTime="2026-03-08 21:50:02.909165392 +0000 UTC m=+8304.305219425" watchObservedRunningTime="2026-03-08 21:50:02.922054456 +0000 UTC m=+8304.318108489" Mar 08 21:50:03 crc kubenswrapper[4885]: I0308 21:50:03.906406 4885 generic.go:334] "Generic (PLEG): container finished" podID="baf0e32a-60d4-4a44-af91-6bbe65bc82c9" containerID="6672cbe9c59e44b713aad8bb2c9fe671a86cd6c40abe9e893295200a4e793993" exitCode=0 Mar 08 21:50:03 crc 
kubenswrapper[4885]: I0308 21:50:03.906553 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" event={"ID":"baf0e32a-60d4-4a44-af91-6bbe65bc82c9","Type":"ContainerDied","Data":"6672cbe9c59e44b713aad8bb2c9fe671a86cd6c40abe9e893295200a4e793993"} Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.406783 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.510454 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzsnf\" (UniqueName: \"kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf\") pod \"baf0e32a-60d4-4a44-af91-6bbe65bc82c9\" (UID: \"baf0e32a-60d4-4a44-af91-6bbe65bc82c9\") " Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.530428 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf" (OuterVolumeSpecName: "kube-api-access-bzsnf") pod "baf0e32a-60d4-4a44-af91-6bbe65bc82c9" (UID: "baf0e32a-60d4-4a44-af91-6bbe65bc82c9"). InnerVolumeSpecName "kube-api-access-bzsnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.532164 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzsnf\" (UniqueName: \"kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf\") on node \"crc\" DevicePath \"\"" Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.936348 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" event={"ID":"baf0e32a-60d4-4a44-af91-6bbe65bc82c9","Type":"ContainerDied","Data":"924b95bbb6317b6a65c4d458f1fea1b8e7f0a14e5ae7b3ca71a36da6871b9601"} Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.936723 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="924b95bbb6317b6a65c4d458f1fea1b8e7f0a14e5ae7b3ca71a36da6871b9601" Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.936655 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:06 crc kubenswrapper[4885]: I0308 21:50:06.020162 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550104-7bjtf"] Mar 08 21:50:06 crc kubenswrapper[4885]: I0308 21:50:06.029087 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550104-7bjtf"] Mar 08 21:50:07 crc kubenswrapper[4885]: I0308 21:50:07.381030 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9887f7af-0796-4481-aa3d-5f4996f9ed47" path="/var/lib/kubelet/pods/9887f7af-0796-4481-aa3d-5f4996f9ed47/volumes" Mar 08 21:50:09 crc kubenswrapper[4885]: I0308 21:50:09.378283 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:50:09 crc kubenswrapper[4885]: E0308 21:50:09.378731 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:50:13 crc kubenswrapper[4885]: I0308 21:50:13.676865 4885 scope.go:117] "RemoveContainer" containerID="dc8414267440eda43954aa07f3b3a3139275d86a35c8b1f3e17198abfe4f8b5c" Mar 08 21:50:21 crc kubenswrapper[4885]: I0308 21:50:21.369173 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:50:21 crc kubenswrapper[4885]: E0308 21:50:21.369963 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:50:32 crc kubenswrapper[4885]: I0308 21:50:32.369223 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:50:32 crc kubenswrapper[4885]: E0308 21:50:32.370314 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:50:45 crc kubenswrapper[4885]: I0308 21:50:45.368423 4885 scope.go:117] "RemoveContainer" 
containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:50:45 crc kubenswrapper[4885]: E0308 21:50:45.369830 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:50:56 crc kubenswrapper[4885]: I0308 21:50:56.368775 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:50:56 crc kubenswrapper[4885]: E0308 21:50:56.370137 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:51:07 crc kubenswrapper[4885]: I0308 21:51:07.370148 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:51:07 crc kubenswrapper[4885]: E0308 21:51:07.371248 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:51:19 crc kubenswrapper[4885]: I0308 21:51:19.385453 4885 scope.go:117] 
"RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:51:19 crc kubenswrapper[4885]: E0308 21:51:19.386591 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:51:34 crc kubenswrapper[4885]: I0308 21:51:34.369191 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:51:34 crc kubenswrapper[4885]: E0308 21:51:34.370046 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:51:45 crc kubenswrapper[4885]: I0308 21:51:45.368556 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:51:45 crc kubenswrapper[4885]: E0308 21:51:45.369425 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:51:56 crc kubenswrapper[4885]: I0308 21:51:56.369143 
4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:51:56 crc kubenswrapper[4885]: E0308 21:51:56.370467 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.153087 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550112-xxhhl"] Mar 08 21:52:00 crc kubenswrapper[4885]: E0308 21:52:00.154357 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf0e32a-60d4-4a44-af91-6bbe65bc82c9" containerName="oc" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.154376 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf0e32a-60d4-4a44-af91-6bbe65bc82c9" containerName="oc" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.154667 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf0e32a-60d4-4a44-af91-6bbe65bc82c9" containerName="oc" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.155752 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.158098 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.158687 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.158753 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.168137 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550112-xxhhl"] Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.188102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p54td\" (UniqueName: \"kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td\") pod \"auto-csr-approver-29550112-xxhhl\" (UID: \"2aae4f06-bf3b-4963-92b4-9dfc6bb69621\") " pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.289727 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p54td\" (UniqueName: \"kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td\") pod \"auto-csr-approver-29550112-xxhhl\" (UID: \"2aae4f06-bf3b-4963-92b4-9dfc6bb69621\") " pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.323153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p54td\" (UniqueName: \"kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td\") pod \"auto-csr-approver-29550112-xxhhl\" (UID: \"2aae4f06-bf3b-4963-92b4-9dfc6bb69621\") " 
pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.483578 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:01 crc kubenswrapper[4885]: I0308 21:52:01.051811 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550112-xxhhl"] Mar 08 21:52:01 crc kubenswrapper[4885]: I0308 21:52:01.442692 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" event={"ID":"2aae4f06-bf3b-4963-92b4-9dfc6bb69621","Type":"ContainerStarted","Data":"d7e2ea54daafbf6031ec8758df8ed83d5b47cb0c91fc4607864110f3b604ca20"} Mar 08 21:52:02 crc kubenswrapper[4885]: I0308 21:52:02.457461 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" event={"ID":"2aae4f06-bf3b-4963-92b4-9dfc6bb69621","Type":"ContainerStarted","Data":"8d2a85311da28c593e02f61d9e21770d4b4946e346cfff8ec77956eb44cbfcfa"} Mar 08 21:52:02 crc kubenswrapper[4885]: I0308 21:52:02.484279 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" podStartSLOduration=1.5832814960000001 podStartE2EDuration="2.48425303s" podCreationTimestamp="2026-03-08 21:52:00 +0000 UTC" firstStartedPulling="2026-03-08 21:52:01.054465173 +0000 UTC m=+8422.450519196" lastFinishedPulling="2026-03-08 21:52:01.955436707 +0000 UTC m=+8423.351490730" observedRunningTime="2026-03-08 21:52:02.479798541 +0000 UTC m=+8423.875852604" watchObservedRunningTime="2026-03-08 21:52:02.48425303 +0000 UTC m=+8423.880307083" Mar 08 21:52:03 crc kubenswrapper[4885]: I0308 21:52:03.480807 4885 generic.go:334] "Generic (PLEG): container finished" podID="2aae4f06-bf3b-4963-92b4-9dfc6bb69621" containerID="8d2a85311da28c593e02f61d9e21770d4b4946e346cfff8ec77956eb44cbfcfa" exitCode=0 Mar 08 21:52:03 crc 
kubenswrapper[4885]: I0308 21:52:03.480864 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" event={"ID":"2aae4f06-bf3b-4963-92b4-9dfc6bb69621","Type":"ContainerDied","Data":"8d2a85311da28c593e02f61d9e21770d4b4946e346cfff8ec77956eb44cbfcfa"} Mar 08 21:52:04 crc kubenswrapper[4885]: I0308 21:52:04.864646 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.006951 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p54td\" (UniqueName: \"kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td\") pod \"2aae4f06-bf3b-4963-92b4-9dfc6bb69621\" (UID: \"2aae4f06-bf3b-4963-92b4-9dfc6bb69621\") " Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.015201 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td" (OuterVolumeSpecName: "kube-api-access-p54td") pod "2aae4f06-bf3b-4963-92b4-9dfc6bb69621" (UID: "2aae4f06-bf3b-4963-92b4-9dfc6bb69621"). InnerVolumeSpecName "kube-api-access-p54td". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.111121 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p54td\" (UniqueName: \"kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.523705 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" event={"ID":"2aae4f06-bf3b-4963-92b4-9dfc6bb69621","Type":"ContainerDied","Data":"d7e2ea54daafbf6031ec8758df8ed83d5b47cb0c91fc4607864110f3b604ca20"} Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.523770 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7e2ea54daafbf6031ec8758df8ed83d5b47cb0c91fc4607864110f3b604ca20" Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.523848 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.577946 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550106-t45l9"] Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.590280 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550106-t45l9"] Mar 08 21:52:07 crc kubenswrapper[4885]: I0308 21:52:07.385568 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0c5067-1184-4a75-a80b-35b1d03f2a47" path="/var/lib/kubelet/pods/7e0c5067-1184-4a75-a80b-35b1d03f2a47/volumes" Mar 08 21:52:10 crc kubenswrapper[4885]: I0308 21:52:10.370733 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:52:10 crc kubenswrapper[4885]: E0308 21:52:10.371867 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:52:13 crc kubenswrapper[4885]: I0308 21:52:13.816005 4885 scope.go:117] "RemoveContainer" containerID="ac6daccc4f232068706f6a472830b30704b48c1d2e179f38f98cf3dcf6cbd7b0" Mar 08 21:52:23 crc kubenswrapper[4885]: I0308 21:52:23.369102 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:52:23 crc kubenswrapper[4885]: E0308 21:52:23.370188 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:52:36 crc kubenswrapper[4885]: I0308 21:52:36.936796 4885 generic.go:334] "Generic (PLEG): container finished" podID="5583daa6-0c35-4fde-8580-2a4d7ccbfb17" containerID="3deaad07a3fde91f11eb384ddd753833e0df6e58738141160f380ba138e7d97a" exitCode=0 Mar 08 21:52:36 crc kubenswrapper[4885]: I0308 21:52:36.936858 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" event={"ID":"5583daa6-0c35-4fde-8580-2a4d7ccbfb17","Type":"ContainerDied","Data":"3deaad07a3fde91f11eb384ddd753833e0df6e58738141160f380ba138e7d97a"} Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.371412 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:52:38 crc 
kubenswrapper[4885]: E0308 21:52:38.372210 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.562601 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.726895 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727031 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-774f5\" (UniqueName: \"kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727075 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727128 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727258 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727350 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727459 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727502 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.733286 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: 
"5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.733684 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph" (OuterVolumeSpecName: "ceph") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.734582 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5" (OuterVolumeSpecName: "kube-api-access-774f5") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "kube-api-access-774f5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.760282 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.776196 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.778994 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.787975 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.792850 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory" (OuterVolumeSpecName: "inventory") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831597 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831654 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831681 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-774f5\" (UniqueName: \"kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831703 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831722 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831740 4885 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831753 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph\") on node \"crc\" 
DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831764 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.966914 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" event={"ID":"5583daa6-0c35-4fde-8580-2a4d7ccbfb17","Type":"ContainerDied","Data":"7810ccd43709ca010d0df55ac127e2ace5d877bcf88c0cdb3c388a67357a2ff3"} Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.967002 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7810ccd43709ca010d0df55ac127e2ace5d877bcf88c0cdb3c388a67357a2ff3" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.967185 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.089801 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-f6sk8"] Mar 08 21:52:39 crc kubenswrapper[4885]: E0308 21:52:39.090319 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aae4f06-bf3b-4963-92b4-9dfc6bb69621" containerName="oc" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.090337 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aae4f06-bf3b-4963-92b4-9dfc6bb69621" containerName="oc" Mar 08 21:52:39 crc kubenswrapper[4885]: E0308 21:52:39.090375 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5583daa6-0c35-4fde-8580-2a4d7ccbfb17" containerName="telemetry-openstack-openstack-cell1" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.090382 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5583daa6-0c35-4fde-8580-2a4d7ccbfb17" 
containerName="telemetry-openstack-openstack-cell1" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.090586 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5583daa6-0c35-4fde-8580-2a4d7ccbfb17" containerName="telemetry-openstack-openstack-cell1" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.090605 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aae4f06-bf3b-4963-92b4-9dfc6bb69621" containerName="oc" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.091423 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.094370 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.094527 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.094673 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.094893 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.095037 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.105766 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-f6sk8"] Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.241063 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph\") pod 
\"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.241136 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.241187 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.241215 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxhz9\" (UniqueName: \"kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.242259 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc 
kubenswrapper[4885]: I0308 21:52:39.242402 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344419 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344495 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxhz9\" (UniqueName: \"kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344679 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344722 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344839 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344943 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.349646 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.350415 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 
21:52:39.351729 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.354091 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.355320 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.377029 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxhz9\" (UniqueName: \"kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.417039 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:40 crc kubenswrapper[4885]: I0308 21:52:40.058503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-f6sk8"] Mar 08 21:52:40 crc kubenswrapper[4885]: I0308 21:52:40.995658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" event={"ID":"13f318e2-a78d-497f-bfbc-4c60d9156220","Type":"ContainerStarted","Data":"cd2b9c2fdbb97360195fc12d8da62d789c72f771d227eb3a5cefc9016f5c2a0d"} Mar 08 21:52:40 crc kubenswrapper[4885]: I0308 21:52:40.996093 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" event={"ID":"13f318e2-a78d-497f-bfbc-4c60d9156220","Type":"ContainerStarted","Data":"e50b0327dd2efd6f8e0ff0bc77acc1b1d4df2a4d1247cc52554a573044973234"} Mar 08 21:52:41 crc kubenswrapper[4885]: I0308 21:52:41.022271 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" podStartSLOduration=1.519376574 podStartE2EDuration="2.022241054s" podCreationTimestamp="2026-03-08 21:52:39 +0000 UTC" firstStartedPulling="2026-03-08 21:52:40.064588737 +0000 UTC m=+8461.460642760" lastFinishedPulling="2026-03-08 21:52:40.567453177 +0000 UTC m=+8461.963507240" observedRunningTime="2026-03-08 21:52:41.015510094 +0000 UTC m=+8462.411564157" watchObservedRunningTime="2026-03-08 21:52:41.022241054 +0000 UTC m=+8462.418295117" Mar 08 21:52:51 crc kubenswrapper[4885]: I0308 21:52:51.368666 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:52:51 crc kubenswrapper[4885]: E0308 21:52:51.369673 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:53:02 crc kubenswrapper[4885]: I0308 21:53:02.368982 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:53:02 crc kubenswrapper[4885]: E0308 21:53:02.370187 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:53:13 crc kubenswrapper[4885]: I0308 21:53:13.368630 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:53:13 crc kubenswrapper[4885]: E0308 21:53:13.369518 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.113226 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.115843 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.156505 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.171536 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.171629 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.171730 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mdql\" (UniqueName: \"kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.273559 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mdql\" (UniqueName: \"kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.273656 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.273741 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.274109 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.274194 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.297865 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mdql\" (UniqueName: \"kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.436374 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.936482 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:23 crc kubenswrapper[4885]: I0308 21:53:23.523067 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerStarted","Data":"630ba5940a762a4ce58dfa8803f182a3722e17630d480c5a02c18c12176e926c"} Mar 08 21:53:24 crc kubenswrapper[4885]: I0308 21:53:24.535452 4885 generic.go:334] "Generic (PLEG): container finished" podID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerID="31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9" exitCode=0 Mar 08 21:53:24 crc kubenswrapper[4885]: I0308 21:53:24.535533 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerDied","Data":"31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9"} Mar 08 21:53:24 crc kubenswrapper[4885]: I0308 21:53:24.538716 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:53:26 crc kubenswrapper[4885]: I0308 21:53:26.569195 4885 generic.go:334] "Generic (PLEG): container finished" podID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerID="153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5" exitCode=0 Mar 08 21:53:26 crc kubenswrapper[4885]: I0308 21:53:26.569273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerDied","Data":"153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5"} Mar 08 21:53:27 crc kubenswrapper[4885]: I0308 21:53:27.582043 4885 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerStarted","Data":"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71"} Mar 08 21:53:27 crc kubenswrapper[4885]: I0308 21:53:27.622687 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p28nt" podStartSLOduration=3.075768656 podStartE2EDuration="5.622666776s" podCreationTimestamp="2026-03-08 21:53:22 +0000 UTC" firstStartedPulling="2026-03-08 21:53:24.538394474 +0000 UTC m=+8505.934448507" lastFinishedPulling="2026-03-08 21:53:27.085292604 +0000 UTC m=+8508.481346627" observedRunningTime="2026-03-08 21:53:27.603527484 +0000 UTC m=+8508.999581527" watchObservedRunningTime="2026-03-08 21:53:27.622666776 +0000 UTC m=+8509.018720809" Mar 08 21:53:28 crc kubenswrapper[4885]: I0308 21:53:28.369704 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:53:28 crc kubenswrapper[4885]: E0308 21:53:28.370322 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:53:32 crc kubenswrapper[4885]: I0308 21:53:32.437006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:32 crc kubenswrapper[4885]: I0308 21:53:32.437359 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:32 crc kubenswrapper[4885]: I0308 21:53:32.503478 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:32 crc kubenswrapper[4885]: I0308 21:53:32.704105 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:32 crc kubenswrapper[4885]: I0308 21:53:32.763328 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:34 crc kubenswrapper[4885]: I0308 21:53:34.665428 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p28nt" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="registry-server" containerID="cri-o://914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71" gracePeriod=2 Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.289348 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.396591 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mdql\" (UniqueName: \"kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql\") pod \"9c91d099-582a-47b1-b6b3-a403a4cdd428\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.396786 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities\") pod \"9c91d099-582a-47b1-b6b3-a403a4cdd428\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.397036 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content\") 
pod \"9c91d099-582a-47b1-b6b3-a403a4cdd428\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.398027 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities" (OuterVolumeSpecName: "utilities") pod "9c91d099-582a-47b1-b6b3-a403a4cdd428" (UID: "9c91d099-582a-47b1-b6b3-a403a4cdd428"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.398412 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.410813 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql" (OuterVolumeSpecName: "kube-api-access-5mdql") pod "9c91d099-582a-47b1-b6b3-a403a4cdd428" (UID: "9c91d099-582a-47b1-b6b3-a403a4cdd428"). InnerVolumeSpecName "kube-api-access-5mdql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.447225 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c91d099-582a-47b1-b6b3-a403a4cdd428" (UID: "9c91d099-582a-47b1-b6b3-a403a4cdd428"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.500467 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mdql\" (UniqueName: \"kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.500500 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.677852 4885 generic.go:334] "Generic (PLEG): container finished" podID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerID="914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71" exitCode=0 Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.677940 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerDied","Data":"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71"} Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.678009 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.678045 4885 scope.go:117] "RemoveContainer" containerID="914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.678025 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerDied","Data":"630ba5940a762a4ce58dfa8803f182a3722e17630d480c5a02c18c12176e926c"} Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.701208 4885 scope.go:117] "RemoveContainer" containerID="153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.724003 4885 scope.go:117] "RemoveContainer" containerID="31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.781419 4885 scope.go:117] "RemoveContainer" containerID="914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71" Mar 08 21:53:35 crc kubenswrapper[4885]: E0308 21:53:35.782247 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71\": container with ID starting with 914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71 not found: ID does not exist" containerID="914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.782425 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71"} err="failed to get container status \"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71\": rpc error: code = NotFound desc = could not find container 
\"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71\": container with ID starting with 914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71 not found: ID does not exist" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.782596 4885 scope.go:117] "RemoveContainer" containerID="153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5" Mar 08 21:53:35 crc kubenswrapper[4885]: E0308 21:53:35.783199 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5\": container with ID starting with 153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5 not found: ID does not exist" containerID="153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.783396 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5"} err="failed to get container status \"153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5\": rpc error: code = NotFound desc = could not find container \"153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5\": container with ID starting with 153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5 not found: ID does not exist" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.783540 4885 scope.go:117] "RemoveContainer" containerID="31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9" Mar 08 21:53:35 crc kubenswrapper[4885]: E0308 21:53:35.787076 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9\": container with ID starting with 31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9 not found: ID does not exist" 
containerID="31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.787127 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9"} err="failed to get container status \"31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9\": rpc error: code = NotFound desc = could not find container \"31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9\": container with ID starting with 31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9 not found: ID does not exist" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.788098 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.799624 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:37 crc kubenswrapper[4885]: I0308 21:53:37.402516 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" path="/var/lib/kubelet/pods/9c91d099-582a-47b1-b6b3-a403a4cdd428/volumes" Mar 08 21:53:39 crc kubenswrapper[4885]: I0308 21:53:39.374761 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:53:39 crc kubenswrapper[4885]: E0308 21:53:39.375228 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:53:46 crc kubenswrapper[4885]: I0308 
21:53:46.826007 4885 generic.go:334] "Generic (PLEG): container finished" podID="13f318e2-a78d-497f-bfbc-4c60d9156220" containerID="cd2b9c2fdbb97360195fc12d8da62d789c72f771d227eb3a5cefc9016f5c2a0d" exitCode=0 Mar 08 21:53:46 crc kubenswrapper[4885]: I0308 21:53:46.826087 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" event={"ID":"13f318e2-a78d-497f-bfbc-4c60d9156220","Type":"ContainerDied","Data":"cd2b9c2fdbb97360195fc12d8da62d789c72f771d227eb3a5cefc9016f5c2a0d"} Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.457814 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567176 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567285 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567372 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567409 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567450 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxhz9\" (UniqueName: \"kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567980 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.575185 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph" (OuterVolumeSpecName: "ceph") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.575361 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9" (OuterVolumeSpecName: "kube-api-access-vxhz9") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). InnerVolumeSpecName "kube-api-access-vxhz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.579180 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.619228 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.619599 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.622411 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory" (OuterVolumeSpecName: "inventory") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.670944 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.671108 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.671197 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.671271 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxhz9\" (UniqueName: \"kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.671348 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.671420 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.856729 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" 
event={"ID":"13f318e2-a78d-497f-bfbc-4c60d9156220","Type":"ContainerDied","Data":"e50b0327dd2efd6f8e0ff0bc77acc1b1d4df2a4d1247cc52554a573044973234"} Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.856774 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e50b0327dd2efd6f8e0ff0bc77acc1b1d4df2a4d1247cc52554a573044973234" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.857333 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.022065 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx"] Mar 08 21:53:49 crc kubenswrapper[4885]: E0308 21:53:49.022748 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="extract-content" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.022768 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="extract-content" Mar 08 21:53:49 crc kubenswrapper[4885]: E0308 21:53:49.022783 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="registry-server" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.022792 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="registry-server" Mar 08 21:53:49 crc kubenswrapper[4885]: E0308 21:53:49.022841 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f318e2-a78d-497f-bfbc-4c60d9156220" containerName="neutron-sriov-openstack-openstack-cell1" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.022851 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f318e2-a78d-497f-bfbc-4c60d9156220" containerName="neutron-sriov-openstack-openstack-cell1" Mar 08 
21:53:49 crc kubenswrapper[4885]: E0308 21:53:49.022866 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="extract-utilities" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.022874 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="extract-utilities" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.023158 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f318e2-a78d-497f-bfbc-4c60d9156220" containerName="neutron-sriov-openstack-openstack-cell1" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.023182 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="registry-server" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.024003 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.028639 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.029269 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.029734 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.030693 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.032386 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.060854 4885 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx"] Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.080762 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.080831 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.080931 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7h4f\" (UniqueName: \"kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.081219 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.081277 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.081316 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183551 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7h4f\" (UniqueName: \"kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183701 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle\") pod 
\"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183730 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183811 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.188243 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.188523 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.189214 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.192351 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.192987 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.203122 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7h4f\" (UniqueName: \"kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc 
kubenswrapper[4885]: I0308 21:53:49.380406 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:50 crc kubenswrapper[4885]: I0308 21:53:50.079235 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx"] Mar 08 21:53:50 crc kubenswrapper[4885]: I0308 21:53:50.885058 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" event={"ID":"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9","Type":"ContainerStarted","Data":"d0bc6f58814aadfc21bb336d61fa5e9392cab32e2c3eaf672c0503eb5cce34c5"} Mar 08 21:53:51 crc kubenswrapper[4885]: I0308 21:53:51.899514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" event={"ID":"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9","Type":"ContainerStarted","Data":"3181598c10cc01d3050bcca680be2ade2157049e157e5e864164210f5ef985ee"} Mar 08 21:53:51 crc kubenswrapper[4885]: I0308 21:53:51.929224 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" podStartSLOduration=2.127837422 podStartE2EDuration="2.929196238s" podCreationTimestamp="2026-03-08 21:53:49 +0000 UTC" firstStartedPulling="2026-03-08 21:53:50.0752158 +0000 UTC m=+8531.471269823" lastFinishedPulling="2026-03-08 21:53:50.876574606 +0000 UTC m=+8532.272628639" observedRunningTime="2026-03-08 21:53:51.921685067 +0000 UTC m=+8533.317739110" watchObservedRunningTime="2026-03-08 21:53:51.929196238 +0000 UTC m=+8533.325250281" Mar 08 21:53:52 crc kubenswrapper[4885]: I0308 21:53:52.369000 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:53:52 crc kubenswrapper[4885]: E0308 21:53:52.369782 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.156250 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550114-tw65c"] Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.159444 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.162674 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.162844 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.163174 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.172584 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550114-tw65c"] Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.270918 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q94gg\" (UniqueName: \"kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg\") pod \"auto-csr-approver-29550114-tw65c\" (UID: \"28077c49-e447-4b53-ab0a-078b678e322e\") " pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.373135 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q94gg\" (UniqueName: \"kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg\") pod \"auto-csr-approver-29550114-tw65c\" (UID: \"28077c49-e447-4b53-ab0a-078b678e322e\") " pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.395074 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q94gg\" (UniqueName: \"kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg\") pod \"auto-csr-approver-29550114-tw65c\" (UID: \"28077c49-e447-4b53-ab0a-078b678e322e\") " pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.483726 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.996939 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550114-tw65c"] Mar 08 21:54:01 crc kubenswrapper[4885]: I0308 21:54:01.045580 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550114-tw65c" event={"ID":"28077c49-e447-4b53-ab0a-078b678e322e","Type":"ContainerStarted","Data":"306c2e12b1fb52993479212fb66225bdc6e39d380b75530d0618372e90263d61"} Mar 08 21:54:03 crc kubenswrapper[4885]: I0308 21:54:03.069537 4885 generic.go:334] "Generic (PLEG): container finished" podID="28077c49-e447-4b53-ab0a-078b678e322e" containerID="3ee4d3c132930646f693aee747f5e8b449d0c9e50fb9f8986810b596ef2d993d" exitCode=0 Mar 08 21:54:03 crc kubenswrapper[4885]: I0308 21:54:03.070005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550114-tw65c" event={"ID":"28077c49-e447-4b53-ab0a-078b678e322e","Type":"ContainerDied","Data":"3ee4d3c132930646f693aee747f5e8b449d0c9e50fb9f8986810b596ef2d993d"} Mar 08 21:54:04 crc kubenswrapper[4885]: I0308 
21:54:04.515007 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:04 crc kubenswrapper[4885]: I0308 21:54:04.591757 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q94gg\" (UniqueName: \"kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg\") pod \"28077c49-e447-4b53-ab0a-078b678e322e\" (UID: \"28077c49-e447-4b53-ab0a-078b678e322e\") " Mar 08 21:54:04 crc kubenswrapper[4885]: I0308 21:54:04.599810 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg" (OuterVolumeSpecName: "kube-api-access-q94gg") pod "28077c49-e447-4b53-ab0a-078b678e322e" (UID: "28077c49-e447-4b53-ab0a-078b678e322e"). InnerVolumeSpecName "kube-api-access-q94gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:54:04 crc kubenswrapper[4885]: I0308 21:54:04.694637 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q94gg\" (UniqueName: \"kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg\") on node \"crc\" DevicePath \"\"" Mar 08 21:54:05 crc kubenswrapper[4885]: I0308 21:54:05.091531 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550114-tw65c" event={"ID":"28077c49-e447-4b53-ab0a-078b678e322e","Type":"ContainerDied","Data":"306c2e12b1fb52993479212fb66225bdc6e39d380b75530d0618372e90263d61"} Mar 08 21:54:05 crc kubenswrapper[4885]: I0308 21:54:05.091574 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="306c2e12b1fb52993479212fb66225bdc6e39d380b75530d0618372e90263d61" Mar 08 21:54:05 crc kubenswrapper[4885]: I0308 21:54:05.091634 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:05 crc kubenswrapper[4885]: I0308 21:54:05.623331 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550108-z4l8k"] Mar 08 21:54:05 crc kubenswrapper[4885]: I0308 21:54:05.635505 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550108-z4l8k"] Mar 08 21:54:07 crc kubenswrapper[4885]: I0308 21:54:07.371291 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:54:07 crc kubenswrapper[4885]: I0308 21:54:07.395081 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea33e917-a39b-4c83-b80a-9562ddbc2459" path="/var/lib/kubelet/pods/ea33e917-a39b-4c83-b80a-9562ddbc2459/volumes" Mar 08 21:54:08 crc kubenswrapper[4885]: I0308 21:54:08.128845 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622"} Mar 08 21:54:13 crc kubenswrapper[4885]: I0308 21:54:13.955528 4885 scope.go:117] "RemoveContainer" containerID="d4e92452936a7719768a56376a13fed75b8884b1fc954bbcc0c2fcc22f2c6332" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.105852 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:43 crc kubenswrapper[4885]: E0308 21:54:43.106854 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28077c49-e447-4b53-ab0a-078b678e322e" containerName="oc" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.106868 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="28077c49-e447-4b53-ab0a-078b678e322e" containerName="oc" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.107167 4885 
memory_manager.go:354] "RemoveStaleState removing state" podUID="28077c49-e447-4b53-ab0a-078b678e322e" containerName="oc" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.109095 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.122640 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.173992 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fllwn\" (UniqueName: \"kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.174063 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.174110 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.276486 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fllwn\" (UniqueName: 
\"kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.276557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.276596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.277113 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.277369 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.304128 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fllwn\" (UniqueName: 
\"kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.442162 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:44 crc kubenswrapper[4885]: I0308 21:54:44.006335 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:44 crc kubenswrapper[4885]: I0308 21:54:44.631378 4885 generic.go:334] "Generic (PLEG): container finished" podID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerID="9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494" exitCode=0 Mar 08 21:54:44 crc kubenswrapper[4885]: I0308 21:54:44.631486 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerDied","Data":"9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494"} Mar 08 21:54:44 crc kubenswrapper[4885]: I0308 21:54:44.631720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerStarted","Data":"cb90bf0b0e93febc0744f518d6e1727e51ee94fb0c9756ce8dc19a061955a61c"} Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.514759 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.519854 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.533887 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.534744 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n2vc\" (UniqueName: \"kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.534851 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.535059 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.637312 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n2vc\" (UniqueName: \"kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.637375 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.637452 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.638023 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.638163 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.646627 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerStarted","Data":"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a"} Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.663039 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n2vc\" (UniqueName: 
\"kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.850442 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:46 crc kubenswrapper[4885]: I0308 21:54:46.473737 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:54:46 crc kubenswrapper[4885]: I0308 21:54:46.662108 4885 generic.go:334] "Generic (PLEG): container finished" podID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerID="06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a" exitCode=0 Mar 08 21:54:46 crc kubenswrapper[4885]: I0308 21:54:46.662184 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerDied","Data":"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a"} Mar 08 21:54:46 crc kubenswrapper[4885]: I0308 21:54:46.664587 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerStarted","Data":"4e025ae4b6cfd5a9631429dbc5ad8db5f4f6ec52af704b61dc5685dd6b7fdcf7"} Mar 08 21:54:47 crc kubenswrapper[4885]: I0308 21:54:47.683324 4885 generic.go:334] "Generic (PLEG): container finished" podID="4eb826cc-db9d-4f89-9736-2365672249ac" containerID="ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9" exitCode=0 Mar 08 21:54:47 crc kubenswrapper[4885]: I0308 21:54:47.685215 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" 
event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerDied","Data":"ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9"} Mar 08 21:54:47 crc kubenswrapper[4885]: I0308 21:54:47.702289 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerStarted","Data":"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4"} Mar 08 21:54:47 crc kubenswrapper[4885]: I0308 21:54:47.730813 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7wdkz" podStartSLOduration=2.285000941 podStartE2EDuration="4.730794784s" podCreationTimestamp="2026-03-08 21:54:43 +0000 UTC" firstStartedPulling="2026-03-08 21:54:44.634241944 +0000 UTC m=+8586.030296007" lastFinishedPulling="2026-03-08 21:54:47.080035837 +0000 UTC m=+8588.476089850" observedRunningTime="2026-03-08 21:54:47.730449855 +0000 UTC m=+8589.126503868" watchObservedRunningTime="2026-03-08 21:54:47.730794784 +0000 UTC m=+8589.126848807" Mar 08 21:54:49 crc kubenswrapper[4885]: I0308 21:54:49.724772 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerStarted","Data":"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018"} Mar 08 21:54:51 crc kubenswrapper[4885]: I0308 21:54:51.752331 4885 generic.go:334] "Generic (PLEG): container finished" podID="4eb826cc-db9d-4f89-9736-2365672249ac" containerID="10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018" exitCode=0 Mar 08 21:54:51 crc kubenswrapper[4885]: I0308 21:54:51.752429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" 
event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerDied","Data":"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018"} Mar 08 21:54:52 crc kubenswrapper[4885]: I0308 21:54:52.773945 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerStarted","Data":"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca"} Mar 08 21:54:52 crc kubenswrapper[4885]: I0308 21:54:52.824697 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tvp9l" podStartSLOduration=3.363874249 podStartE2EDuration="7.824670267s" podCreationTimestamp="2026-03-08 21:54:45 +0000 UTC" firstStartedPulling="2026-03-08 21:54:47.68754081 +0000 UTC m=+8589.083594863" lastFinishedPulling="2026-03-08 21:54:52.148336828 +0000 UTC m=+8593.544390881" observedRunningTime="2026-03-08 21:54:52.811686691 +0000 UTC m=+8594.207740744" watchObservedRunningTime="2026-03-08 21:54:52.824670267 +0000 UTC m=+8594.220724330" Mar 08 21:54:53 crc kubenswrapper[4885]: I0308 21:54:53.443372 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:53 crc kubenswrapper[4885]: I0308 21:54:53.443635 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:53 crc kubenswrapper[4885]: I0308 21:54:53.508530 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:53 crc kubenswrapper[4885]: I0308 21:54:53.856213 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:55 crc kubenswrapper[4885]: I0308 21:54:55.502361 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:55 crc kubenswrapper[4885]: I0308 21:54:55.810017 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7wdkz" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="registry-server" containerID="cri-o://2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4" gracePeriod=2 Mar 08 21:54:55 crc kubenswrapper[4885]: I0308 21:54:55.850801 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:55 crc kubenswrapper[4885]: I0308 21:54:55.850855 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:55 crc kubenswrapper[4885]: I0308 21:54:55.915074 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.412895 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.500869 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content\") pod \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.501101 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities\") pod \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.501126 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fllwn\" (UniqueName: \"kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn\") pod \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.501774 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities" (OuterVolumeSpecName: "utilities") pod "bc4b25e1-fca9-4293-bc47-4a56fa5be25e" (UID: "bc4b25e1-fca9-4293-bc47-4a56fa5be25e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.507134 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn" (OuterVolumeSpecName: "kube-api-access-fllwn") pod "bc4b25e1-fca9-4293-bc47-4a56fa5be25e" (UID: "bc4b25e1-fca9-4293-bc47-4a56fa5be25e"). InnerVolumeSpecName "kube-api-access-fllwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.578426 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc4b25e1-fca9-4293-bc47-4a56fa5be25e" (UID: "bc4b25e1-fca9-4293-bc47-4a56fa5be25e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.603764 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.603806 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.603821 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fllwn\" (UniqueName: \"kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn\") on node \"crc\" DevicePath \"\"" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.820850 4885 generic.go:334] "Generic (PLEG): container finished" podID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerID="2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4" exitCode=0 Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.822584 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.823001 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerDied","Data":"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4"} Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.823035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerDied","Data":"cb90bf0b0e93febc0744f518d6e1727e51ee94fb0c9756ce8dc19a061955a61c"} Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.823163 4885 scope.go:117] "RemoveContainer" containerID="2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.844193 4885 scope.go:117] "RemoveContainer" containerID="06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.865328 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.875197 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.891950 4885 scope.go:117] "RemoveContainer" containerID="9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.937709 4885 scope.go:117] "RemoveContainer" containerID="2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4" Mar 08 21:54:56 crc kubenswrapper[4885]: E0308 21:54:56.938266 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4\": container with ID starting with 2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4 not found: ID does not exist" containerID="2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.938375 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4"} err="failed to get container status \"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4\": rpc error: code = NotFound desc = could not find container \"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4\": container with ID starting with 2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4 not found: ID does not exist" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.938450 4885 scope.go:117] "RemoveContainer" containerID="06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a" Mar 08 21:54:56 crc kubenswrapper[4885]: E0308 21:54:56.938833 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a\": container with ID starting with 06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a not found: ID does not exist" containerID="06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.938988 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a"} err="failed to get container status \"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a\": rpc error: code = NotFound desc = could not find container \"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a\": container with ID 
starting with 06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a not found: ID does not exist" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.939083 4885 scope.go:117] "RemoveContainer" containerID="9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494" Mar 08 21:54:56 crc kubenswrapper[4885]: E0308 21:54:56.939656 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494\": container with ID starting with 9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494 not found: ID does not exist" containerID="9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.939773 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494"} err="failed to get container status \"9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494\": rpc error: code = NotFound desc = could not find container \"9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494\": container with ID starting with 9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494 not found: ID does not exist" Mar 08 21:54:57 crc kubenswrapper[4885]: I0308 21:54:57.390617 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" path="/var/lib/kubelet/pods/bc4b25e1-fca9-4293-bc47-4a56fa5be25e/volumes" Mar 08 21:55:05 crc kubenswrapper[4885]: I0308 21:55:05.928578 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:55:06 crc kubenswrapper[4885]: I0308 21:55:06.032493 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:55:06 crc 
kubenswrapper[4885]: I0308 21:55:06.963895 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tvp9l" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="registry-server" containerID="cri-o://26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca" gracePeriod=2 Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.616939 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.684279 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n2vc\" (UniqueName: \"kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc\") pod \"4eb826cc-db9d-4f89-9736-2365672249ac\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.684601 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content\") pod \"4eb826cc-db9d-4f89-9736-2365672249ac\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.684749 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities\") pod \"4eb826cc-db9d-4f89-9736-2365672249ac\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.685736 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities" (OuterVolumeSpecName: "utilities") pod "4eb826cc-db9d-4f89-9736-2365672249ac" (UID: "4eb826cc-db9d-4f89-9736-2365672249ac"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.690329 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc" (OuterVolumeSpecName: "kube-api-access-5n2vc") pod "4eb826cc-db9d-4f89-9736-2365672249ac" (UID: "4eb826cc-db9d-4f89-9736-2365672249ac"). InnerVolumeSpecName "kube-api-access-5n2vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.739052 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4eb826cc-db9d-4f89-9736-2365672249ac" (UID: "4eb826cc-db9d-4f89-9736-2365672249ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.788330 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n2vc\" (UniqueName: \"kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.788391 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.788411 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:07.997602 4885 generic.go:334] "Generic (PLEG): container finished" podID="4eb826cc-db9d-4f89-9736-2365672249ac" 
containerID="26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca" exitCode=0 Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:07.998198 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:07.998218 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerDied","Data":"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca"} Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.005425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerDied","Data":"4e025ae4b6cfd5a9631429dbc5ad8db5f4f6ec52af704b61dc5685dd6b7fdcf7"} Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.005490 4885 scope.go:117] "RemoveContainer" containerID="26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.059330 4885 scope.go:117] "RemoveContainer" containerID="10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.063320 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.075209 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.099525 4885 scope.go:117] "RemoveContainer" containerID="ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.140345 4885 scope.go:117] "RemoveContainer" containerID="26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca" Mar 08 
21:55:08 crc kubenswrapper[4885]: E0308 21:55:08.140713 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca\": container with ID starting with 26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca not found: ID does not exist" containerID="26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.140755 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca"} err="failed to get container status \"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca\": rpc error: code = NotFound desc = could not find container \"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca\": container with ID starting with 26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca not found: ID does not exist" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.140807 4885 scope.go:117] "RemoveContainer" containerID="10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018" Mar 08 21:55:08 crc kubenswrapper[4885]: E0308 21:55:08.141057 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018\": container with ID starting with 10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018 not found: ID does not exist" containerID="10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.141099 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018"} err="failed to get container status 
\"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018\": rpc error: code = NotFound desc = could not find container \"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018\": container with ID starting with 10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018 not found: ID does not exist" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.141118 4885 scope.go:117] "RemoveContainer" containerID="ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9" Mar 08 21:55:08 crc kubenswrapper[4885]: E0308 21:55:08.141356 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9\": container with ID starting with ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9 not found: ID does not exist" containerID="ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.141397 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9"} err="failed to get container status \"ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9\": rpc error: code = NotFound desc = could not find container \"ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9\": container with ID starting with ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9 not found: ID does not exist" Mar 08 21:55:09 crc kubenswrapper[4885]: I0308 21:55:09.387698 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" path="/var/lib/kubelet/pods/4eb826cc-db9d-4f89-9736-2365672249ac/volumes" Mar 08 21:55:12 crc kubenswrapper[4885]: I0308 21:55:12.055691 4885 generic.go:334] "Generic (PLEG): container finished" podID="b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" 
containerID="3181598c10cc01d3050bcca680be2ade2157049e157e5e864164210f5ef985ee" exitCode=0 Mar 08 21:55:12 crc kubenswrapper[4885]: I0308 21:55:12.055810 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" event={"ID":"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9","Type":"ContainerDied","Data":"3181598c10cc01d3050bcca680be2ade2157049e157e5e864164210f5ef985ee"} Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.586640 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636061 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7h4f\" (UniqueName: \"kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636391 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636513 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636550 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" 
(UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636599 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636683 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.643048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph" (OuterVolumeSpecName: "ceph") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.646243 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.661154 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f" (OuterVolumeSpecName: "kube-api-access-s7h4f") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "kube-api-access-s7h4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.666117 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory" (OuterVolumeSpecName: "inventory") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.675242 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.683746 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740423 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7h4f\" (UniqueName: \"kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740454 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740465 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740475 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740485 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740494 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:14 crc kubenswrapper[4885]: I0308 21:55:14.081998 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" 
event={"ID":"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9","Type":"ContainerDied","Data":"d0bc6f58814aadfc21bb336d61fa5e9392cab32e2c3eaf672c0503eb5cce34c5"} Mar 08 21:55:14 crc kubenswrapper[4885]: I0308 21:55:14.082073 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:55:14 crc kubenswrapper[4885]: I0308 21:55:14.082097 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0bc6f58814aadfc21bb336d61fa5e9392cab32e2c3eaf672c0503eb5cce34c5" Mar 08 21:55:36 crc kubenswrapper[4885]: I0308 21:55:36.575061 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:55:36 crc kubenswrapper[4885]: I0308 21:55:36.575913 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="b042d37f-f908-40b8-88be-21798a9428f6" containerName="nova-cell0-conductor-conductor" containerID="cri-o://3eb13c978d994d153edf32828a7f7fe52fe2b797781fd84bb013541ebf4584bf" gracePeriod=30 Mar 08 21:55:36 crc kubenswrapper[4885]: I0308 21:55:36.615593 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:55:36 crc kubenswrapper[4885]: I0308 21:55:36.616284 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://a2acd6bf101df1fb3fec18f77f85983c836074e8e9c6420f407dfa0fe15f85c1" gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.243740 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.243994 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" 
podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerName="nova-scheduler-scheduler" containerID="cri-o://949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.331613 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.332093 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-log" containerID="cri-o://2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea" gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.332137 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-api" containerID="cri-o://cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be" gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.348198 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.348763 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log" containerID="cri-o://37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1" gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.349218 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata" containerID="cri-o://88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc" gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.413289 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" containerID="a2acd6bf101df1fb3fec18f77f85983c836074e8e9c6420f407dfa0fe15f85c1" exitCode=0 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.413336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"95d9d37c-0204-47e9-956d-d93f2dd1e94d","Type":"ContainerDied","Data":"a2acd6bf101df1fb3fec18f77f85983c836074e8e9c6420f407dfa0fe15f85c1"} Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.813141 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.910891 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vfz6\" (UniqueName: \"kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6\") pod \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.911035 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle\") pod \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.911117 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data\") pod \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.924703 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6" (OuterVolumeSpecName: "kube-api-access-5vfz6") pod 
"95d9d37c-0204-47e9-956d-d93f2dd1e94d" (UID: "95d9d37c-0204-47e9-956d-d93f2dd1e94d"). InnerVolumeSpecName "kube-api-access-5vfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.939079 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data" (OuterVolumeSpecName: "config-data") pod "95d9d37c-0204-47e9-956d-d93f2dd1e94d" (UID: "95d9d37c-0204-47e9-956d-d93f2dd1e94d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.945023 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95d9d37c-0204-47e9-956d-d93f2dd1e94d" (UID: "95d9d37c-0204-47e9-956d-d93f2dd1e94d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.014336 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.014369 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.014378 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vfz6\" (UniqueName: \"kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.424285 4885 generic.go:334] "Generic (PLEG): container finished" podID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerID="2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea" exitCode=143 Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.424366 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerDied","Data":"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea"} Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.426967 4885 generic.go:334] "Generic (PLEG): container finished" podID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerID="37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1" exitCode=143 Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.427032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerDied","Data":"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1"} Mar 08 21:55:38 crc 
kubenswrapper[4885]: I0308 21:55:38.428262 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"95d9d37c-0204-47e9-956d-d93f2dd1e94d","Type":"ContainerDied","Data":"4989a2ecee24254fe4aca7e76f1e2e40b733be1e70cbeec7d60131da54ab2fd4"}
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.428291 4885 scope.go:117] "RemoveContainer" containerID="a2acd6bf101df1fb3fec18f77f85983c836074e8e9c6420f407dfa0fe15f85c1"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.428326 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.475547 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.491739 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503253 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503763 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="extract-content"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503781 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="extract-content"
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503804 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" containerName="nova-cell1-conductor-conductor"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503814 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" containerName="nova-cell1-conductor-conductor"
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503826 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="registry-server"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503831 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="registry-server"
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503843 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="registry-server"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503849 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="registry-server"
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503862 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="extract-content"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503868 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="extract-content"
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503885 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" containerName="neutron-dhcp-openstack-openstack-cell1"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503891 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" containerName="neutron-dhcp-openstack-openstack-cell1"
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503902 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="extract-utilities"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503908 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="extract-utilities"
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503945 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="extract-utilities"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503953 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="extract-utilities"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.504193 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="registry-server"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.504215 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" containerName="neutron-dhcp-openstack-openstack-cell1"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.504271 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="registry-server"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.504285 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" containerName="nova-cell1-conductor-conductor"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.505116 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.507429 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.514433 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.629283 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9w8v\" (UniqueName: \"kubernetes.io/projected/390628a6-50b8-491e-bc5d-80a524b67be6-kube-api-access-m9w8v\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.629522 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.629606 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.732021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.732113 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.732183 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9w8v\" (UniqueName: \"kubernetes.io/projected/390628a6-50b8-491e-bc5d-80a524b67be6-kube-api-access-m9w8v\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.736740 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.737444 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.754651 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9w8v\" (UniqueName: \"kubernetes.io/projected/390628a6-50b8-491e-bc5d-80a524b67be6-kube-api-access-m9w8v\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.766745 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.768745 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.769999 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.770045 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerName="nova-scheduler-scheduler"
Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.825405 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.298130 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.392639 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" path="/var/lib/kubelet/pods/95d9d37c-0204-47e9-956d-d93f2dd1e94d/volumes"
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.446044 4885 generic.go:334] "Generic (PLEG): container finished" podID="b042d37f-f908-40b8-88be-21798a9428f6" containerID="3eb13c978d994d153edf32828a7f7fe52fe2b797781fd84bb013541ebf4584bf" exitCode=0
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.446081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b042d37f-f908-40b8-88be-21798a9428f6","Type":"ContainerDied","Data":"3eb13c978d994d153edf32828a7f7fe52fe2b797781fd84bb013541ebf4584bf"}
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.447309 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"390628a6-50b8-491e-bc5d-80a524b67be6","Type":"ContainerStarted","Data":"baff6c3712c2b2039b847d6c62c829fe033c897beac1cf5de551094705fb1cad"}
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.723318 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.854744 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle\") pod \"b042d37f-f908-40b8-88be-21798a9428f6\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") "
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.855580 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv9z6\" (UniqueName: \"kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6\") pod \"b042d37f-f908-40b8-88be-21798a9428f6\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") "
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.855696 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data\") pod \"b042d37f-f908-40b8-88be-21798a9428f6\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") "
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.861189 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6" (OuterVolumeSpecName: "kube-api-access-tv9z6") pod "b042d37f-f908-40b8-88be-21798a9428f6" (UID: "b042d37f-f908-40b8-88be-21798a9428f6"). InnerVolumeSpecName "kube-api-access-tv9z6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.887889 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data" (OuterVolumeSpecName: "config-data") pod "b042d37f-f908-40b8-88be-21798a9428f6" (UID: "b042d37f-f908-40b8-88be-21798a9428f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.908668 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b042d37f-f908-40b8-88be-21798a9428f6" (UID: "b042d37f-f908-40b8-88be-21798a9428f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.959537 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.959595 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv9z6\" (UniqueName: \"kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6\") on node \"crc\" DevicePath \"\""
Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.959617 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.464121 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"390628a6-50b8-491e-bc5d-80a524b67be6","Type":"ContainerStarted","Data":"5a60c52e16059b05f1034589a8a5bab7af13e01f6b8ac1e2d16ee1100d38fd62"}
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.465820 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.479425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b042d37f-f908-40b8-88be-21798a9428f6","Type":"ContainerDied","Data":"80d76197d607834e3f09a7896979af3b1a5308464484d654754ae837dd80a0ab"}
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.479478 4885 scope.go:117] "RemoveContainer" containerID="3eb13c978d994d153edf32828a7f7fe52fe2b797781fd84bb013541ebf4584bf"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.479572 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.497271 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.497247931 podStartE2EDuration="2.497247931s" podCreationTimestamp="2026-03-08 21:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:55:40.484747207 +0000 UTC m=+8641.880801240" watchObservedRunningTime="2026-03-08 21:55:40.497247931 +0000 UTC m=+8641.893301944"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.501918 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.127:8775/\": read tcp 10.217.0.2:49212->10.217.1.127:8775: read: connection reset by peer"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.502179 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.127:8775/\": read tcp 10.217.0.2:49202->10.217.1.127:8775: read: connection reset by peer"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.540540 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.572001 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.579978 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 08 21:55:40 crc kubenswrapper[4885]: E0308 21:55:40.580484 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b042d37f-f908-40b8-88be-21798a9428f6" containerName="nova-cell0-conductor-conductor"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.580520 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b042d37f-f908-40b8-88be-21798a9428f6" containerName="nova-cell0-conductor-conductor"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.580743 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b042d37f-f908-40b8-88be-21798a9428f6" containerName="nova-cell0-conductor-conductor"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.581489 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.586507 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.590880 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.696637 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.696672 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cghxl\" (UniqueName: \"kubernetes.io/projected/3534e95e-b33c-4294-98d0-f758ea92cf72-kube-api-access-cghxl\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.698082 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.805319 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.805723 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cghxl\" (UniqueName: \"kubernetes.io/projected/3534e95e-b33c-4294-98d0-f758ea92cf72-kube-api-access-cghxl\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.806615 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.811536 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.840529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.846579 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cghxl\" (UniqueName: \"kubernetes.io/projected/3534e95e-b33c-4294-98d0-f758ea92cf72-kube-api-access-cghxl\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.946161 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.009242 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs\") pod \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") "
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.009708 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data\") pod \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") "
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.009732 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56dvv\" (UniqueName: \"kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv\") pod \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") "
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.010301 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle\") pod \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") "
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.010238 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs" (OuterVolumeSpecName: "logs") pod "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" (UID: "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.011072 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs\") on node \"crc\" DevicePath \"\""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.013695 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv" (OuterVolumeSpecName: "kube-api-access-56dvv") pod "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" (UID: "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a"). InnerVolumeSpecName "kube-api-access-56dvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.018875 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.048600 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data" (OuterVolumeSpecName: "config-data") pod "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" (UID: "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.051993 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.053864 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" (UID: "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.112508 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle\") pod \"f008edbb-a92d-45b1-ab9d-a56978d20e75\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") "
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.112749 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttvw5\" (UniqueName: \"kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5\") pod \"f008edbb-a92d-45b1-ab9d-a56978d20e75\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") "
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.112812 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data\") pod \"f008edbb-a92d-45b1-ab9d-a56978d20e75\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") "
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.112836 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs\") pod \"f008edbb-a92d-45b1-ab9d-a56978d20e75\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") "
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.113308 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.113321 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56dvv\" (UniqueName: \"kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv\") on node \"crc\" DevicePath \"\""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.113331 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.114094 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs" (OuterVolumeSpecName: "logs") pod "f008edbb-a92d-45b1-ab9d-a56978d20e75" (UID: "f008edbb-a92d-45b1-ab9d-a56978d20e75"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.123212 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5" (OuterVolumeSpecName: "kube-api-access-ttvw5") pod "f008edbb-a92d-45b1-ab9d-a56978d20e75" (UID: "f008edbb-a92d-45b1-ab9d-a56978d20e75"). InnerVolumeSpecName "kube-api-access-ttvw5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.150034 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f008edbb-a92d-45b1-ab9d-a56978d20e75" (UID: "f008edbb-a92d-45b1-ab9d-a56978d20e75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.153868 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data" (OuterVolumeSpecName: "config-data") pod "f008edbb-a92d-45b1-ab9d-a56978d20e75" (UID: "f008edbb-a92d-45b1-ab9d-a56978d20e75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.214256 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.214293 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttvw5\" (UniqueName: \"kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5\") on node \"crc\" DevicePath \"\""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.214308 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.214319 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs\") on node \"crc\" DevicePath \"\""
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.382207 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b042d37f-f908-40b8-88be-21798a9428f6" path="/var/lib/kubelet/pods/b042d37f-f908-40b8-88be-21798a9428f6/volumes"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.493932 4885 generic.go:334] "Generic (PLEG): container finished" podID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerID="cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be" exitCode=0
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.493991 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.494001 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerDied","Data":"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be"}
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.494330 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerDied","Data":"e11dd41c5f061aeb15ad1b8571823004739345378aecc51a3efc7e9384ef84f7"}
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.494352 4885 scope.go:117] "RemoveContainer" containerID="cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.497360 4885 generic.go:334] "Generic (PLEG): container finished" podID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerID="88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc" exitCode=0
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.497425 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.497440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerDied","Data":"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc"}
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.497467 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerDied","Data":"d6b2799aab2f7b019fc1cf13b4754da3de421e26bd6b42741c010f54fac4b62f"}
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.534482 4885 scope.go:117] "RemoveContainer" containerID="2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.543977 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.561417 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.569159 4885 scope.go:117] "RemoveContainer" containerID="cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be"
Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.570124 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be\": container with ID starting with cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be not found: ID does not exist" containerID="cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.570164 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be"} err="failed to get container status \"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be\": rpc error: code = NotFound desc = could not find container \"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be\": container with ID starting with cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be not found: ID does not exist"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.570188 4885 scope.go:117] "RemoveContainer" containerID="2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.571592 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.572093 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572105 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata"
Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.572130 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-api"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572137 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-api"
Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.572148 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572157 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log"
Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.572183 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-log"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572189 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-log"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572383 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-log"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572404 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-api"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572418 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572433 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.573569 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.574588 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea\": container with ID starting with 2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea not found: ID does not exist" containerID="2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.574619 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea"} err="failed to get container status \"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea\": rpc error: code = NotFound desc = could not find container \"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea\": container with ID starting with 2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea not found: ID does not exist"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.574636 4885 scope.go:117] "RemoveContainer" containerID="88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.575554 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.581492 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.592595 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.604173 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.614979 4885 kubelet.go:2421] "SyncLoop
ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.616980 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.618735 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.622895 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623025 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws766\" (UniqueName: \"kubernetes.io/projected/afd37ef2-90bf-4ea4-86a1-2113a005824e-kube-api-access-ws766\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623129 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d37f17-3e80-43b1-b6e3-df2316900973-logs\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623187 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bssn4\" (UniqueName: \"kubernetes.io/projected/a2d37f17-3e80-43b1-b6e3-df2316900973-kube-api-access-bssn4\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623269 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-config-data\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623307 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd37ef2-90bf-4ea4-86a1-2113a005824e-logs\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623359 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-config-data\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623431 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.631388 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726020 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d37f17-3e80-43b1-b6e3-df2316900973-logs\") pod \"nova-metadata-0\" (UID: 
\"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726087 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bssn4\" (UniqueName: \"kubernetes.io/projected/a2d37f17-3e80-43b1-b6e3-df2316900973-kube-api-access-bssn4\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726109 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-config-data\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726187 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd37ef2-90bf-4ea4-86a1-2113a005824e-logs\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726215 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-config-data\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726267 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726305 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws766\" (UniqueName: \"kubernetes.io/projected/afd37ef2-90bf-4ea4-86a1-2113a005824e-kube-api-access-ws766\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.727080 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd37ef2-90bf-4ea4-86a1-2113a005824e-logs\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.727417 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d37f17-3e80-43b1-b6e3-df2316900973-logs\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.733604 4885 scope.go:117] "RemoveContainer" containerID="37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.733854 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.734114 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.734113 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-config-data\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.743422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bssn4\" (UniqueName: \"kubernetes.io/projected/a2d37f17-3e80-43b1-b6e3-df2316900973-kube-api-access-bssn4\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.744632 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-config-data\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.745088 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws766\" (UniqueName: \"kubernetes.io/projected/afd37ef2-90bf-4ea4-86a1-2113a005824e-kube-api-access-ws766\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.835734 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.846231 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.847503 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.850711 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.850843 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.850989 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.851132 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.855483 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.855646 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.857640 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.864751 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.881454 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.886846 4885 scope.go:117] "RemoveContainer" containerID="88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc" Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.887461 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc\": container with ID starting with 88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc not found: ID does not exist" containerID="88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.887505 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc"} err="failed to get container status \"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc\": rpc error: code = NotFound desc = could not find container \"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc\": container with ID starting with 88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc not found: ID does not exist" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.887550 4885 scope.go:117] "RemoveContainer" containerID="37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1" Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.887875 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1\": container with ID starting with 
37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1 not found: ID does not exist" containerID="37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.887934 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1"} err="failed to get container status \"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1\": rpc error: code = NotFound desc = could not find container \"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1\": container with ID starting with 37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1 not found: ID does not exist" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.928654 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929271 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929322 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929361 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929392 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929429 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929451 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929476 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929494 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwdl2\" (UniqueName: \"kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929545 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929591 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929630 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929646 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036534 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036563 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036586 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036609 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036625 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwdl2\" (UniqueName: \"kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036664 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036702 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036740 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036757 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") 
" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036808 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.043977 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" 
Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.045305 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.046095 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.047881 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.048307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.049327 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.049356 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.051316 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.052734 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.053532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.053954 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.062324 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.062959 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwdl2\" (UniqueName: \"kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.186531 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.414146 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:55:42 crc kubenswrapper[4885]: W0308 21:55:42.460216 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2d37f17_3e80_43b1_b6e3_df2316900973.slice/crio-231bc1c918ac3a985eec5e5f0f1d33a6f576b1cd107749c248652be3416e9578 WatchSource:0}: Error finding container 231bc1c918ac3a985eec5e5f0f1d33a6f576b1cd107749c248652be3416e9578: Status 404 returned error can't find the container with id 231bc1c918ac3a985eec5e5f0f1d33a6f576b1cd107749c248652be3416e9578 Mar 08 21:55:42 crc kubenswrapper[4885]: W0308 21:55:42.510073 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd37ef2_90bf_4ea4_86a1_2113a005824e.slice/crio-f7b47ae3e145a420d94469e0367830089cbe27ad3ac94bbc5ef7ada8aa2f509e WatchSource:0}: Error finding container f7b47ae3e145a420d94469e0367830089cbe27ad3ac94bbc5ef7ada8aa2f509e: Status 404 returned error can't find the container with id f7b47ae3e145a420d94469e0367830089cbe27ad3ac94bbc5ef7ada8aa2f509e Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.515942 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.525358 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a2d37f17-3e80-43b1-b6e3-df2316900973","Type":"ContainerStarted","Data":"231bc1c918ac3a985eec5e5f0f1d33a6f576b1cd107749c248652be3416e9578"} Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.528117 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"3534e95e-b33c-4294-98d0-f758ea92cf72","Type":"ContainerStarted","Data":"b6747750f2088baadaef68544fc734cc86f717816cc17ca0b34ff3b054312391"} Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.528147 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3534e95e-b33c-4294-98d0-f758ea92cf72","Type":"ContainerStarted","Data":"76a16339acb7b6242d62b0f55ebec5fa9bfb3d918fc8b3c0d59cea3377bb772b"} Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.529203 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.547520 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.5473522539999998 podStartE2EDuration="2.547352254s" podCreationTimestamp="2026-03-08 21:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:55:42.547027415 +0000 UTC m=+8643.943081438" watchObservedRunningTime="2026-03-08 21:55:42.547352254 +0000 UTC m=+8643.943406277" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.780727 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx"] Mar 08 21:55:42 crc kubenswrapper[4885]: W0308 21:55:42.793767 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ac2d268_855a_485e_a96f_87b5cc0e4f6e.slice/crio-bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2 WatchSource:0}: Error finding container bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2: Status 404 returned error can't find the container with id bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2 Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 
21:55:43.121174 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.164184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle\") pod \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.164349 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz7p5\" (UniqueName: \"kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5\") pod \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.164461 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data\") pod \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.169975 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5" (OuterVolumeSpecName: "kube-api-access-lz7p5") pod "e5c41752-6a6f-4bbf-882f-a1e873cd225f" (UID: "e5c41752-6a6f-4bbf-882f-a1e873cd225f"). InnerVolumeSpecName "kube-api-access-lz7p5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.191965 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5c41752-6a6f-4bbf-882f-a1e873cd225f" (UID: "e5c41752-6a6f-4bbf-882f-a1e873cd225f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.202215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data" (OuterVolumeSpecName: "config-data") pod "e5c41752-6a6f-4bbf-882f-a1e873cd225f" (UID: "e5c41752-6a6f-4bbf-882f-a1e873cd225f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.267818 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.267858 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz7p5\" (UniqueName: \"kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.267875 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.379434 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" path="/var/lib/kubelet/pods/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a/volumes" Mar 
08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.381042 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" path="/var/lib/kubelet/pods/f008edbb-a92d-45b1-ab9d-a56978d20e75/volumes" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.564474 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" event={"ID":"0ac2d268-855a-485e-a96f-87b5cc0e4f6e","Type":"ContainerStarted","Data":"bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.565916 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afd37ef2-90bf-4ea4-86a1-2113a005824e","Type":"ContainerStarted","Data":"c0965d3d5ce970351c46ebe5f000b7b526f00ca05bc8cac2ff9b123f0e265806"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.565979 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afd37ef2-90bf-4ea4-86a1-2113a005824e","Type":"ContainerStarted","Data":"455a26ce7ce5a1165b370ace53bfe8b11ccdb91628aafbb1bfecdcbd431e7048"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.565988 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afd37ef2-90bf-4ea4-86a1-2113a005824e","Type":"ContainerStarted","Data":"f7b47ae3e145a420d94469e0367830089cbe27ad3ac94bbc5ef7ada8aa2f509e"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.574137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a2d37f17-3e80-43b1-b6e3-df2316900973","Type":"ContainerStarted","Data":"f986557e81454bf1db78eec12e02883dfa242d341cd8fe143e711a41264a5899"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.574369 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"a2d37f17-3e80-43b1-b6e3-df2316900973","Type":"ContainerStarted","Data":"9c553ecbf8b599e7d8354dccb7e5f894debb74419e8225b98b8d5cf3ffbbb67b"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.580131 4885 generic.go:334] "Generic (PLEG): container finished" podID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" exitCode=0 Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.580219 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e5c41752-6a6f-4bbf-882f-a1e873cd225f","Type":"ContainerDied","Data":"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.580295 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e5c41752-6a6f-4bbf-882f-a1e873cd225f","Type":"ContainerDied","Data":"ef5f617fb77787529ec827122c4f886ba5c0cc15c252f2142be0c62ea54a65b1"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.580334 4885 scope.go:117] "RemoveContainer" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.580413 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.605143 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.605124633 podStartE2EDuration="2.605124633s" podCreationTimestamp="2026-03-08 21:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:55:43.588684024 +0000 UTC m=+8644.984738047" watchObservedRunningTime="2026-03-08 21:55:43.605124633 +0000 UTC m=+8645.001178656" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.618548 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.618523031 podStartE2EDuration="2.618523031s" podCreationTimestamp="2026-03-08 21:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:55:43.618187061 +0000 UTC m=+8645.014241104" watchObservedRunningTime="2026-03-08 21:55:43.618523031 +0000 UTC m=+8645.014577054" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.620396 4885 scope.go:117] "RemoveContainer" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" Mar 08 21:55:43 crc kubenswrapper[4885]: E0308 21:55:43.624283 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403\": container with ID starting with 949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403 not found: ID does not exist" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.624348 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403"} err="failed to get container status \"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403\": rpc error: code = NotFound desc = could not find container \"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403\": container with ID starting with 949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403 not found: ID does not exist" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.653672 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.663968 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.678302 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:43 crc kubenswrapper[4885]: E0308 21:55:43.678992 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerName="nova-scheduler-scheduler" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.679062 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerName="nova-scheduler-scheduler" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.679315 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerName="nova-scheduler-scheduler" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.680074 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.683391 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.689902 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.774910 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.774988 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtrrt\" (UniqueName: \"kubernetes.io/projected/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-kube-api-access-dtrrt\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.775152 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-config-data\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.876997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-config-data\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.877125 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.877162 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtrrt\" (UniqueName: \"kubernetes.io/projected/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-kube-api-access-dtrrt\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.882789 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.887514 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-config-data\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.902762 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtrrt\" (UniqueName: \"kubernetes.io/projected/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-kube-api-access-dtrrt\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:44 crc kubenswrapper[4885]: I0308 21:55:44.005342 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:55:44 crc kubenswrapper[4885]: I0308 21:55:44.505870 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:44 crc kubenswrapper[4885]: I0308 21:55:44.606821 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" event={"ID":"0ac2d268-855a-485e-a96f-87b5cc0e4f6e","Type":"ContainerStarted","Data":"5bd421a1b9eb744ee9eff61f628d2ad2fc68006fdecefae53f924670048d44f4"} Mar 08 21:55:44 crc kubenswrapper[4885]: I0308 21:55:44.608677 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0","Type":"ContainerStarted","Data":"332381c985ef489b5b2b2c43b5d02ad256fdc8ee96097b248e95d8cf8d80eac4"} Mar 08 21:55:44 crc kubenswrapper[4885]: I0308 21:55:44.642108 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" podStartSLOduration=3.174973741 podStartE2EDuration="3.642087457s" podCreationTimestamp="2026-03-08 21:55:41 +0000 UTC" firstStartedPulling="2026-03-08 21:55:42.802642977 +0000 UTC m=+8644.198697000" lastFinishedPulling="2026-03-08 21:55:43.269756693 +0000 UTC m=+8644.665810716" observedRunningTime="2026-03-08 21:55:44.629420989 +0000 UTC m=+8646.025475032" watchObservedRunningTime="2026-03-08 21:55:44.642087457 +0000 UTC m=+8646.038141480" Mar 08 21:55:45 crc kubenswrapper[4885]: I0308 21:55:45.387957 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" path="/var/lib/kubelet/pods/e5c41752-6a6f-4bbf-882f-a1e873cd225f/volumes" Mar 08 21:55:45 crc kubenswrapper[4885]: I0308 21:55:45.643570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0","Type":"ContainerStarted","Data":"546e0708e2f027193a01bfb4b1f0bf3f5cfa6cb7968bc0ce004add563a9451aa"} Mar 08 21:55:45 crc kubenswrapper[4885]: I0308 21:55:45.674248 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.674231253 podStartE2EDuration="2.674231253s" podCreationTimestamp="2026-03-08 21:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:55:45.663841825 +0000 UTC m=+8647.059895848" watchObservedRunningTime="2026-03-08 21:55:45.674231253 +0000 UTC m=+8647.070285276" Mar 08 21:55:46 crc kubenswrapper[4885]: I0308 21:55:46.109890 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:46 crc kubenswrapper[4885]: I0308 21:55:46.865531 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:55:46 crc kubenswrapper[4885]: I0308 21:55:46.865650 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:55:48 crc kubenswrapper[4885]: I0308 21:55:48.883012 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:49 crc kubenswrapper[4885]: I0308 21:55:49.005902 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 21:55:51 crc kubenswrapper[4885]: I0308 21:55:51.837593 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:55:51 crc kubenswrapper[4885]: I0308 21:55:51.837985 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:55:51 crc kubenswrapper[4885]: I0308 21:55:51.866009 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:55:51 crc kubenswrapper[4885]: I0308 21:55:51.866139 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:55:53 crc kubenswrapper[4885]: I0308 21:55:53.031164 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a2d37f17-3e80-43b1-b6e3-df2316900973" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.0.45:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:55:53 crc kubenswrapper[4885]: I0308 21:55:53.031532 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a2d37f17-3e80-43b1-b6e3-df2316900973" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.45:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:55:53 crc kubenswrapper[4885]: I0308 21:55:53.031489 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="afd37ef2-90bf-4ea4-86a1-2113a005824e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.42:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:55:53 crc kubenswrapper[4885]: I0308 21:55:53.031896 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="afd37ef2-90bf-4ea4-86a1-2113a005824e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.42:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:55:54 crc kubenswrapper[4885]: I0308 21:55:54.005529 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 21:55:54 crc kubenswrapper[4885]: I0308 21:55:54.053679 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0" Mar 08 21:55:54 crc kubenswrapper[4885]: I0308 21:55:54.839087 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.158801 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550116-p5zpr"] Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.161584 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.164968 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.165083 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.166735 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.180175 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550116-p5zpr"] Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.266323 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99zwj\" (UniqueName: \"kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj\") pod \"auto-csr-approver-29550116-p5zpr\" (UID: \"480855ed-5f7f-4fb4-99dc-ced66ce15999\") " pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.369242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99zwj\" (UniqueName: \"kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj\") pod 
\"auto-csr-approver-29550116-p5zpr\" (UID: \"480855ed-5f7f-4fb4-99dc-ced66ce15999\") " pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.400138 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99zwj\" (UniqueName: \"kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj\") pod \"auto-csr-approver-29550116-p5zpr\" (UID: \"480855ed-5f7f-4fb4-99dc-ced66ce15999\") " pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.489717 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.078529 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550116-p5zpr"] Mar 08 21:56:01 crc kubenswrapper[4885]: W0308 21:56:01.078636 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod480855ed_5f7f_4fb4_99dc_ced66ce15999.slice/crio-676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6 WatchSource:0}: Error finding container 676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6: Status 404 returned error can't find the container with id 676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6 Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.846487 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.847829 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.853841 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 
21:56:01.858789 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.869849 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.876423 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.885180 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.916047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" event={"ID":"480855ed-5f7f-4fb4-99dc-ced66ce15999","Type":"ContainerStarted","Data":"676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6"} Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.916370 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.926640 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:56:02 crc kubenswrapper[4885]: I0308 21:56:02.930671 4885 generic.go:334] "Generic (PLEG): container finished" podID="480855ed-5f7f-4fb4-99dc-ced66ce15999" containerID="6341a20e412f738fa67c5354da928321dc5ae4ed993b46a3d4ae33371480585f" exitCode=0 Mar 08 21:56:02 crc kubenswrapper[4885]: I0308 21:56:02.930760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" event={"ID":"480855ed-5f7f-4fb4-99dc-ced66ce15999","Type":"ContainerDied","Data":"6341a20e412f738fa67c5354da928321dc5ae4ed993b46a3d4ae33371480585f"} Mar 08 21:56:02 crc kubenswrapper[4885]: I0308 21:56:02.933460 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.445947 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.572899 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99zwj\" (UniqueName: \"kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj\") pod \"480855ed-5f7f-4fb4-99dc-ced66ce15999\" (UID: \"480855ed-5f7f-4fb4-99dc-ced66ce15999\") " Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.579405 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj" (OuterVolumeSpecName: "kube-api-access-99zwj") pod "480855ed-5f7f-4fb4-99dc-ced66ce15999" (UID: "480855ed-5f7f-4fb4-99dc-ced66ce15999"). InnerVolumeSpecName "kube-api-access-99zwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.675298 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99zwj\" (UniqueName: \"kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj\") on node \"crc\" DevicePath \"\"" Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.963151 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.963153 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" event={"ID":"480855ed-5f7f-4fb4-99dc-ced66ce15999","Type":"ContainerDied","Data":"676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6"} Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.963619 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6" Mar 08 21:56:05 crc kubenswrapper[4885]: I0308 21:56:05.532258 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550110-9ztjc"] Mar 08 21:56:05 crc kubenswrapper[4885]: I0308 21:56:05.543879 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550110-9ztjc"] Mar 08 21:56:07 crc kubenswrapper[4885]: I0308 21:56:07.387854 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baf0e32a-60d4-4a44-af91-6bbe65bc82c9" path="/var/lib/kubelet/pods/baf0e32a-60d4-4a44-af91-6bbe65bc82c9/volumes" Mar 08 21:56:14 crc kubenswrapper[4885]: I0308 21:56:14.174197 4885 scope.go:117] "RemoveContainer" containerID="6672cbe9c59e44b713aad8bb2c9fe671a86cd6c40abe9e893295200a4e793993" Mar 08 21:56:32 crc kubenswrapper[4885]: I0308 21:56:32.818674 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:56:32 crc kubenswrapper[4885]: I0308 21:56:32.819205 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:57:02 crc kubenswrapper[4885]: I0308 21:57:02.818392 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:57:02 crc kubenswrapper[4885]: I0308 21:57:02.819145 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:57:32 crc kubenswrapper[4885]: I0308 21:57:32.818951 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:57:32 crc kubenswrapper[4885]: I0308 21:57:32.819637 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:57:32 crc kubenswrapper[4885]: I0308 21:57:32.819704 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:57:32 crc kubenswrapper[4885]: I0308 21:57:32.820944 4885 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:57:32 crc kubenswrapper[4885]: I0308 21:57:32.821020 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622" gracePeriod=600 Mar 08 21:57:33 crc kubenswrapper[4885]: I0308 21:57:33.555984 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622" exitCode=0 Mar 08 21:57:33 crc kubenswrapper[4885]: I0308 21:57:33.556522 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622"} Mar 08 21:57:33 crc kubenswrapper[4885]: I0308 21:57:33.556549 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"} Mar 08 21:57:33 crc kubenswrapper[4885]: I0308 21:57:33.556582 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.148589 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550118-28jxb"] Mar 08 21:58:00 crc 
kubenswrapper[4885]: E0308 21:58:00.149562 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480855ed-5f7f-4fb4-99dc-ced66ce15999" containerName="oc" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.149574 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="480855ed-5f7f-4fb4-99dc-ced66ce15999" containerName="oc" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.149817 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="480855ed-5f7f-4fb4-99dc-ced66ce15999" containerName="oc" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.150525 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.156398 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.157030 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.159299 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.164431 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550118-28jxb"] Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.272330 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7twms\" (UniqueName: \"kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms\") pod \"auto-csr-approver-29550118-28jxb\" (UID: \"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc\") " pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.374050 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-7twms\" (UniqueName: \"kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms\") pod \"auto-csr-approver-29550118-28jxb\" (UID: \"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc\") " pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.392807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7twms\" (UniqueName: \"kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms\") pod \"auto-csr-approver-29550118-28jxb\" (UID: \"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc\") " pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.489625 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.972870 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550118-28jxb"] Mar 08 21:58:01 crc kubenswrapper[4885]: I0308 21:58:01.156762 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550118-28jxb" event={"ID":"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc","Type":"ContainerStarted","Data":"e1bd39f573a525120c9814b24982db4cc3a012a96b2c42182904524b87a90731"} Mar 08 21:58:02 crc kubenswrapper[4885]: I0308 21:58:02.165876 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550118-28jxb" event={"ID":"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc","Type":"ContainerStarted","Data":"0d8b1adfd4f970a1d96597c5128035560d0e6296ee9788584fde3df1fcb99135"} Mar 08 21:58:02 crc kubenswrapper[4885]: I0308 21:58:02.187599 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550118-28jxb" podStartSLOduration=1.359045845 podStartE2EDuration="2.187581157s" 
podCreationTimestamp="2026-03-08 21:58:00 +0000 UTC" firstStartedPulling="2026-03-08 21:58:00.975839039 +0000 UTC m=+8782.371893062" lastFinishedPulling="2026-03-08 21:58:01.804374341 +0000 UTC m=+8783.200428374" observedRunningTime="2026-03-08 21:58:02.178658779 +0000 UTC m=+8783.574712802" watchObservedRunningTime="2026-03-08 21:58:02.187581157 +0000 UTC m=+8783.583635180" Mar 08 21:58:03 crc kubenswrapper[4885]: I0308 21:58:03.179210 4885 generic.go:334] "Generic (PLEG): container finished" podID="0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" containerID="0d8b1adfd4f970a1d96597c5128035560d0e6296ee9788584fde3df1fcb99135" exitCode=0 Mar 08 21:58:03 crc kubenswrapper[4885]: I0308 21:58:03.179302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550118-28jxb" event={"ID":"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc","Type":"ContainerDied","Data":"0d8b1adfd4f970a1d96597c5128035560d0e6296ee9788584fde3df1fcb99135"} Mar 08 21:58:04 crc kubenswrapper[4885]: I0308 21:58:04.609550 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:04 crc kubenswrapper[4885]: I0308 21:58:04.793150 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7twms\" (UniqueName: \"kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms\") pod \"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc\" (UID: \"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc\") " Mar 08 21:58:04 crc kubenswrapper[4885]: I0308 21:58:04.800903 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms" (OuterVolumeSpecName: "kube-api-access-7twms") pod "0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" (UID: "0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc"). InnerVolumeSpecName "kube-api-access-7twms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:58:04 crc kubenswrapper[4885]: I0308 21:58:04.897375 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7twms\" (UniqueName: \"kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms\") on node \"crc\" DevicePath \"\"" Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.225196 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550118-28jxb" event={"ID":"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc","Type":"ContainerDied","Data":"e1bd39f573a525120c9814b24982db4cc3a012a96b2c42182904524b87a90731"} Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.225255 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1bd39f573a525120c9814b24982db4cc3a012a96b2c42182904524b87a90731" Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.225353 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.266712 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550112-xxhhl"] Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.280355 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550112-xxhhl"] Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.387990 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aae4f06-bf3b-4963-92b4-9dfc6bb69621" path="/var/lib/kubelet/pods/2aae4f06-bf3b-4963-92b4-9dfc6bb69621/volumes" Mar 08 21:58:14 crc kubenswrapper[4885]: I0308 21:58:14.427682 4885 scope.go:117] "RemoveContainer" containerID="8d2a85311da28c593e02f61d9e21770d4b4946e346cfff8ec77956eb44cbfcfa" Mar 08 21:59:33 crc kubenswrapper[4885]: I0308 21:59:33.685213 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="0ac2d268-855a-485e-a96f-87b5cc0e4f6e" containerID="5bd421a1b9eb744ee9eff61f628d2ad2fc68006fdecefae53f924670048d44f4" exitCode=0 Mar 08 21:59:33 crc kubenswrapper[4885]: I0308 21:59:33.685344 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" event={"ID":"0ac2d268-855a-485e-a96f-87b5cc0e4f6e","Type":"ContainerDied","Data":"5bd421a1b9eb744ee9eff61f628d2ad2fc68006fdecefae53f924670048d44f4"} Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.180917 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.354189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.354808 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.354889 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.354949 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.354968 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355010 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355037 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355080 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355123 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwdl2\" (UniqueName: \"kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: 
\"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355165 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355240 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355274 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.359771 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2" (OuterVolumeSpecName: "kube-api-access-gwdl2") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "kube-api-access-gwdl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.360041 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph" (OuterVolumeSpecName: "ceph") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.374431 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.392022 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.393161 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.398024 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory" (OuterVolumeSpecName: "inventory") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.405333 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.406062 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.412160 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cells-global-config-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.418096 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.424725 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.431551 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.431908 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458117 4885 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458174 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458195 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458214 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458233 4885 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458251 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458269 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458289 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458324 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458342 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458362 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458384 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwdl2\" (UniqueName: \"kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458403 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.712297 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" 
event={"ID":"0ac2d268-855a-485e-a96f-87b5cc0e4f6e","Type":"ContainerDied","Data":"bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2"} Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.712348 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.712351 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.677423 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"] Mar 08 21:59:51 crc kubenswrapper[4885]: E0308 21:59:51.678800 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" containerName="oc" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.678823 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" containerName="oc" Mar 08 21:59:51 crc kubenswrapper[4885]: E0308 21:59:51.678877 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac2d268-855a-485e-a96f-87b5cc0e4f6e" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.678892 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac2d268-855a-485e-a96f-87b5cc0e4f6e" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.679353 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac2d268-855a-485e-a96f-87b5cc0e4f6e" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.679420 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" containerName="oc" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.683339 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.697418 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"] Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.741596 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.741661 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8gv\" (UniqueName: \"kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.741780 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.843691 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " 
pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.843747 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8gv\" (UniqueName: \"kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.843868 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.844610 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.844618 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.864000 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8gv\" (UniqueName: \"kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:52 
crc kubenswrapper[4885]: I0308 21:59:52.004296 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:52 crc kubenswrapper[4885]: I0308 21:59:52.498696 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"] Mar 08 21:59:52 crc kubenswrapper[4885]: I0308 21:59:52.924296 4885 generic.go:334] "Generic (PLEG): container finished" podID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerID="c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64" exitCode=0 Mar 08 21:59:52 crc kubenswrapper[4885]: I0308 21:59:52.924688 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerDied","Data":"c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64"} Mar 08 21:59:52 crc kubenswrapper[4885]: I0308 21:59:52.924717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerStarted","Data":"99a2eed0975662340cabb29646db1c9fc839d1e4761a0358fe8a7e592a0111d3"} Mar 08 21:59:52 crc kubenswrapper[4885]: I0308 21:59:52.926749 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:59:53 crc kubenswrapper[4885]: I0308 21:59:53.939245 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerStarted","Data":"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907"} Mar 08 21:59:56 crc kubenswrapper[4885]: I0308 21:59:56.976231 4885 generic.go:334] "Generic (PLEG): container finished" podID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerID="42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907" exitCode=0 Mar 08 21:59:56 crc 
kubenswrapper[4885]: I0308 21:59:56.976289 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerDied","Data":"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907"} Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.008785 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerStarted","Data":"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f"} Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.038795 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lzz29" podStartSLOduration=2.836654887 podStartE2EDuration="9.038777536s" podCreationTimestamp="2026-03-08 21:59:51 +0000 UTC" firstStartedPulling="2026-03-08 21:59:52.926565049 +0000 UTC m=+8894.322619062" lastFinishedPulling="2026-03-08 21:59:59.128687688 +0000 UTC m=+8900.524741711" observedRunningTime="2026-03-08 22:00:00.031211815 +0000 UTC m=+8901.427265838" watchObservedRunningTime="2026-03-08 22:00:00.038777536 +0000 UTC m=+8901.434831559" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.147448 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550120-gtf7r"] Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.149318 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.151249 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.151971 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.152155 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.157064 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4"] Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.158953 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.160161 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.160301 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.166180 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550120-gtf7r"] Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.175829 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4"] Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.235473 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.235533 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvbf5\" (UniqueName: \"kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5\") pod \"auto-csr-approver-29550120-gtf7r\" (UID: \"df80a103-8bbc-4c66-8995-05152b8b9b66\") " pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.235940 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.236113 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r6df\" (UniqueName: \"kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.337663 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 
22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.337743 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r6df\" (UniqueName: \"kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.337815 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.337833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbf5\" (UniqueName: \"kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5\") pod \"auto-csr-approver-29550120-gtf7r\" (UID: \"df80a103-8bbc-4c66-8995-05152b8b9b66\") " pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.339214 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.348217 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: 
\"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.353887 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvbf5\" (UniqueName: \"kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5\") pod \"auto-csr-approver-29550120-gtf7r\" (UID: \"df80a103-8bbc-4c66-8995-05152b8b9b66\") " pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.358121 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r6df\" (UniqueName: \"kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.472623 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.487059 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: W0308 22:00:00.990897 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf80a103_8bbc_4c66_8995_05152b8b9b66.slice/crio-e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639 WatchSource:0}: Error finding container e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639: Status 404 returned error can't find the container with id e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639 Mar 08 22:00:01 crc kubenswrapper[4885]: I0308 22:00:01.003380 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550120-gtf7r"] Mar 08 22:00:01 crc kubenswrapper[4885]: I0308 22:00:01.028114 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" event={"ID":"df80a103-8bbc-4c66-8995-05152b8b9b66","Type":"ContainerStarted","Data":"e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639"} Mar 08 22:00:01 crc kubenswrapper[4885]: I0308 22:00:01.079093 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4"] Mar 08 22:00:01 crc kubenswrapper[4885]: W0308 22:00:01.088469 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d5ced5_8334_4732_bc13_a8fbf2e27acf.slice/crio-6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a WatchSource:0}: Error finding container 6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a: Status 404 returned error can't find the container with id 6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.004950 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.005199 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.040777 4885 generic.go:334] "Generic (PLEG): container finished" podID="06d5ced5-8334-4732-bc13-a8fbf2e27acf" containerID="8582316db402e0c93eabdefa470a1f0e233e7e0f8d71ac1dcb4dcae392602245" exitCode=0 Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.040820 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" event={"ID":"06d5ced5-8334-4732-bc13-a8fbf2e27acf","Type":"ContainerDied","Data":"8582316db402e0c93eabdefa470a1f0e233e7e0f8d71ac1dcb4dcae392602245"} Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.040846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" event={"ID":"06d5ced5-8334-4732-bc13-a8fbf2e27acf","Type":"ContainerStarted","Data":"6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a"} Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.818114 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.818489 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 
22:00:03.063598 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lzz29" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="registry-server" probeResult="failure" output=< Mar 08 22:00:03 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 22:00:03 crc kubenswrapper[4885]: > Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.545341 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.633284 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume\") pod \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.633441 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume\") pod \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.633510 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r6df\" (UniqueName: \"kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df\") pod \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.634096 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume" (OuterVolumeSpecName: "config-volume") pod "06d5ced5-8334-4732-bc13-a8fbf2e27acf" (UID: 
"06d5ced5-8334-4732-bc13-a8fbf2e27acf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.635352 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.640119 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06d5ced5-8334-4732-bc13-a8fbf2e27acf" (UID: "06d5ced5-8334-4732-bc13-a8fbf2e27acf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.641416 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df" (OuterVolumeSpecName: "kube-api-access-8r6df") pod "06d5ced5-8334-4732-bc13-a8fbf2e27acf" (UID: "06d5ced5-8334-4732-bc13-a8fbf2e27acf"). InnerVolumeSpecName "kube-api-access-8r6df". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.737063 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.737098 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r6df\" (UniqueName: \"kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df\") on node \"crc\" DevicePath \"\"" Mar 08 22:00:04 crc kubenswrapper[4885]: I0308 22:00:04.085654 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" event={"ID":"06d5ced5-8334-4732-bc13-a8fbf2e27acf","Type":"ContainerDied","Data":"6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a"} Mar 08 22:00:04 crc kubenswrapper[4885]: I0308 22:00:04.085906 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:04 crc kubenswrapper[4885]: I0308 22:00:04.085940 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a" Mar 08 22:00:04 crc kubenswrapper[4885]: I0308 22:00:04.642710 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb"] Mar 08 22:00:04 crc kubenswrapper[4885]: I0308 22:00:04.653036 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb"] Mar 08 22:00:05 crc kubenswrapper[4885]: I0308 22:00:05.379592 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7dec2e5-804e-4bc5-99cc-370c31d352e0" path="/var/lib/kubelet/pods/e7dec2e5-804e-4bc5-99cc-370c31d352e0/volumes" Mar 08 22:00:07 crc kubenswrapper[4885]: I0308 22:00:07.119998 4885 generic.go:334] "Generic (PLEG): container finished" podID="df80a103-8bbc-4c66-8995-05152b8b9b66" containerID="c7a9de3d006a67b60be16dde31bd3a619a9734cffde62a18b4a5fd2544360347" exitCode=0 Mar 08 22:00:07 crc kubenswrapper[4885]: I0308 22:00:07.120058 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" event={"ID":"df80a103-8bbc-4c66-8995-05152b8b9b66","Type":"ContainerDied","Data":"c7a9de3d006a67b60be16dde31bd3a619a9734cffde62a18b4a5fd2544360347"} Mar 08 22:00:08 crc kubenswrapper[4885]: I0308 22:00:08.531203 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:08 crc kubenswrapper[4885]: I0308 22:00:08.658778 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvbf5\" (UniqueName: \"kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5\") pod \"df80a103-8bbc-4c66-8995-05152b8b9b66\" (UID: \"df80a103-8bbc-4c66-8995-05152b8b9b66\") " Mar 08 22:00:08 crc kubenswrapper[4885]: I0308 22:00:08.664811 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5" (OuterVolumeSpecName: "kube-api-access-rvbf5") pod "df80a103-8bbc-4c66-8995-05152b8b9b66" (UID: "df80a103-8bbc-4c66-8995-05152b8b9b66"). InnerVolumeSpecName "kube-api-access-rvbf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:00:08 crc kubenswrapper[4885]: I0308 22:00:08.761491 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvbf5\" (UniqueName: \"kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5\") on node \"crc\" DevicePath \"\"" Mar 08 22:00:09 crc kubenswrapper[4885]: I0308 22:00:09.142904 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" event={"ID":"df80a103-8bbc-4c66-8995-05152b8b9b66","Type":"ContainerDied","Data":"e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639"} Mar 08 22:00:09 crc kubenswrapper[4885]: I0308 22:00:09.143401 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639" Mar 08 22:00:09 crc kubenswrapper[4885]: I0308 22:00:09.143031 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:09 crc kubenswrapper[4885]: I0308 22:00:09.604059 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550114-tw65c"] Mar 08 22:00:09 crc kubenswrapper[4885]: I0308 22:00:09.613692 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550114-tw65c"] Mar 08 22:00:11 crc kubenswrapper[4885]: I0308 22:00:11.385448 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28077c49-e447-4b53-ab0a-078b678e322e" path="/var/lib/kubelet/pods/28077c49-e447-4b53-ab0a-078b678e322e/volumes" Mar 08 22:00:12 crc kubenswrapper[4885]: I0308 22:00:12.081614 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 22:00:12 crc kubenswrapper[4885]: I0308 22:00:12.150388 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 22:00:12 crc kubenswrapper[4885]: I0308 22:00:12.337503 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"] Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.186200 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lzz29" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="registry-server" containerID="cri-o://1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f" gracePeriod=2 Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.709651 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lzz29"
Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.803944 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities\") pod \"54512cac-df49-4ce5-aeea-b3f205d06de8\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") "
Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.804012 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content\") pod \"54512cac-df49-4ce5-aeea-b3f205d06de8\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") "
Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.804137 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq8gv\" (UniqueName: \"kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv\") pod \"54512cac-df49-4ce5-aeea-b3f205d06de8\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") "
Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.830529 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities" (OuterVolumeSpecName: "utilities") pod "54512cac-df49-4ce5-aeea-b3f205d06de8" (UID: "54512cac-df49-4ce5-aeea-b3f205d06de8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.840171 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv" (OuterVolumeSpecName: "kube-api-access-zq8gv") pod "54512cac-df49-4ce5-aeea-b3f205d06de8" (UID: "54512cac-df49-4ce5-aeea-b3f205d06de8"). InnerVolumeSpecName "kube-api-access-zq8gv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.906768 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.906814 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq8gv\" (UniqueName: \"kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv\") on node \"crc\" DevicePath \"\""
Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.962521 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54512cac-df49-4ce5-aeea-b3f205d06de8" (UID: "54512cac-df49-4ce5-aeea-b3f205d06de8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.008707 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.199583 4885 generic.go:334] "Generic (PLEG): container finished" podID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerID="1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f" exitCode=0
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.199649 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerDied","Data":"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f"}
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.199676 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzz29"
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.200060 4885 scope.go:117] "RemoveContainer" containerID="1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f"
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.200038 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerDied","Data":"99a2eed0975662340cabb29646db1c9fc839d1e4761a0358fe8a7e592a0111d3"}
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.233883 4885 scope.go:117] "RemoveContainer" containerID="42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907"
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.267713 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"]
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.290986 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"]
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.306139 4885 scope.go:117] "RemoveContainer" containerID="c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64"
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.332904 4885 scope.go:117] "RemoveContainer" containerID="1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f"
Mar 08 22:00:14 crc kubenswrapper[4885]: E0308 22:00:14.333688 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f\": container with ID starting with 1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f not found: ID does not exist" containerID="1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f"
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.333736 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f"} err="failed to get container status \"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f\": rpc error: code = NotFound desc = could not find container \"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f\": container with ID starting with 1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f not found: ID does not exist"
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.333769 4885 scope.go:117] "RemoveContainer" containerID="42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907"
Mar 08 22:00:14 crc kubenswrapper[4885]: E0308 22:00:14.334176 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907\": container with ID starting with 42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907 not found: ID does not exist" containerID="42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907"
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.334236 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907"} err="failed to get container status \"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907\": rpc error: code = NotFound desc = could not find container \"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907\": container with ID starting with 42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907 not found: ID does not exist"
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.334274 4885 scope.go:117] "RemoveContainer" containerID="c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64"
Mar 08 22:00:14 crc kubenswrapper[4885]: E0308 22:00:14.334837 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64\": container with ID starting with c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64 not found: ID does not exist" containerID="c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64"
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.334873 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64"} err="failed to get container status \"c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64\": rpc error: code = NotFound desc = could not find container \"c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64\": container with ID starting with c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64 not found: ID does not exist"
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.548563 4885 scope.go:117] "RemoveContainer" containerID="b737af02da1f4abbd613f83608c2ed474264bbee77babc5263321db11c1a06ed"
Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.610689 4885 scope.go:117] "RemoveContainer" containerID="3ee4d3c132930646f693aee747f5e8b449d0c9e50fb9f8986810b596ef2d993d"
Mar 08 22:00:15 crc kubenswrapper[4885]: I0308 22:00:15.389663 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" path="/var/lib/kubelet/pods/54512cac-df49-4ce5-aeea-b3f205d06de8/volumes"
Mar 08 22:00:32 crc kubenswrapper[4885]: I0308 22:00:32.818439 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 22:00:32 crc kubenswrapper[4885]: I0308 22:00:32.819141 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.158036 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29550121-m29ck"]
Mar 08 22:01:00 crc kubenswrapper[4885]: E0308 22:01:00.159149 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="extract-content"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159163 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="extract-content"
Mar 08 22:01:00 crc kubenswrapper[4885]: E0308 22:01:00.159180 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d5ced5-8334-4732-bc13-a8fbf2e27acf" containerName="collect-profiles"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159188 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d5ced5-8334-4732-bc13-a8fbf2e27acf" containerName="collect-profiles"
Mar 08 22:01:00 crc kubenswrapper[4885]: E0308 22:01:00.159203 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="extract-utilities"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159210 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="extract-utilities"
Mar 08 22:01:00 crc kubenswrapper[4885]: E0308 22:01:00.159220 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="registry-server"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159226 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="registry-server"
Mar 08 22:01:00 crc kubenswrapper[4885]: E0308 22:01:00.159256 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df80a103-8bbc-4c66-8995-05152b8b9b66" containerName="oc"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159262 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="df80a103-8bbc-4c66-8995-05152b8b9b66" containerName="oc"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159443 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d5ced5-8334-4732-bc13-a8fbf2e27acf" containerName="collect-profiles"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159454 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="registry-server"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159461 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="df80a103-8bbc-4c66-8995-05152b8b9b66" containerName="oc"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.160204 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.176755 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550121-m29ck"]
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.358270 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.358816 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.358912 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvrfw\" (UniqueName: \"kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.359034 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.461410 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.461597 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.461656 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvrfw\" (UniqueName: \"kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.461733 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.475985 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.476020 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.480792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.499339 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvrfw\" (UniqueName: \"kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.515826 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.833579 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550121-m29ck"]
Mar 08 22:01:01 crc kubenswrapper[4885]: I0308 22:01:01.802400 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550121-m29ck" event={"ID":"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd","Type":"ContainerStarted","Data":"5045186789d21eeaf5b33dd4d5307b17ab20d55f22a9c0e57932c33279bc15fd"}
Mar 08 22:01:01 crc kubenswrapper[4885]: I0308 22:01:01.802749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550121-m29ck" event={"ID":"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd","Type":"ContainerStarted","Data":"0476e0ea6b5ca5d0bbfbb66107bcbce455f731f5f92931010490582f516c9879"}
Mar 08 22:01:01 crc kubenswrapper[4885]: I0308 22:01:01.836320 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29550121-m29ck" podStartSLOduration=1.8362786789999999 podStartE2EDuration="1.836278679s" podCreationTimestamp="2026-03-08 22:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 22:01:01.824777191 +0000 UTC m=+8963.220831284" watchObservedRunningTime="2026-03-08 22:01:01.836278679 +0000 UTC m=+8963.232332742"
Mar 08 22:01:02 crc kubenswrapper[4885]: I0308 22:01:02.817876 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 22:01:02 crc kubenswrapper[4885]: I0308 22:01:02.818129 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 22:01:02 crc kubenswrapper[4885]: I0308 22:01:02.818163 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 22:01:02 crc kubenswrapper[4885]: I0308 22:01:02.818593 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 22:01:02 crc kubenswrapper[4885]: I0308 22:01:02.818633 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" gracePeriod=600
Mar 08 22:01:02 crc kubenswrapper[4885]: E0308 22:01:02.956004 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:01:03 crc kubenswrapper[4885]: I0308 22:01:03.823540 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" exitCode=0
Mar 08 22:01:03 crc kubenswrapper[4885]: I0308 22:01:03.823683 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"}
Mar 08 22:01:03 crc kubenswrapper[4885]: I0308 22:01:03.823876 4885 scope.go:117] "RemoveContainer" containerID="22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622"
Mar 08 22:01:03 crc kubenswrapper[4885]: I0308 22:01:03.824617 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"
Mar 08 22:01:03 crc kubenswrapper[4885]: E0308 22:01:03.825165 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:01:04 crc kubenswrapper[4885]: E0308 22:01:04.033171 4885 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.80:41494->38.102.83.80:33667: write tcp 38.102.83.80:41494->38.102.83.80:33667: write: broken pipe
Mar 08 22:01:04 crc kubenswrapper[4885]: I0308 22:01:04.835039 4885 generic.go:334] "Generic (PLEG): container finished" podID="8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" containerID="5045186789d21eeaf5b33dd4d5307b17ab20d55f22a9c0e57932c33279bc15fd" exitCode=0
Mar 08 22:01:04 crc kubenswrapper[4885]: I0308 22:01:04.835072 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550121-m29ck" event={"ID":"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd","Type":"ContainerDied","Data":"5045186789d21eeaf5b33dd4d5307b17ab20d55f22a9c0e57932c33279bc15fd"}
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.335544 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.396650 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvrfw\" (UniqueName: \"kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw\") pod \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") "
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.396980 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys\") pod \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") "
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.397054 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data\") pod \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") "
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.397084 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle\") pod \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") "
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.404146 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw" (OuterVolumeSpecName: "kube-api-access-vvrfw") pod "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" (UID: "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd"). InnerVolumeSpecName "kube-api-access-vvrfw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.405380 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" (UID: "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.435836 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" (UID: "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.467268 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data" (OuterVolumeSpecName: "config-data") pod "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" (UID: "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.499600 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvrfw\" (UniqueName: \"kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw\") on node \"crc\" DevicePath \"\""
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.499632 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.499643 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data\") on node \"crc\" DevicePath \"\""
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.499650 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.863653 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550121-m29ck" event={"ID":"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd","Type":"ContainerDied","Data":"0476e0ea6b5ca5d0bbfbb66107bcbce455f731f5f92931010490582f516c9879"}
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.863705 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0476e0ea6b5ca5d0bbfbb66107bcbce455f731f5f92931010490582f516c9879"
Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.863779 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550121-m29ck"
Mar 08 22:01:18 crc kubenswrapper[4885]: I0308 22:01:18.368697 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"
Mar 08 22:01:18 crc kubenswrapper[4885]: E0308 22:01:18.369797 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:01:29 crc kubenswrapper[4885]: I0308 22:01:29.382262 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"
Mar 08 22:01:29 crc kubenswrapper[4885]: E0308 22:01:29.383178 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:01:44 crc kubenswrapper[4885]: I0308 22:01:44.392524 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"
Mar 08 22:01:44 crc kubenswrapper[4885]: E0308 22:01:44.393400 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:01:58 crc kubenswrapper[4885]: I0308 22:01:58.369536 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"
Mar 08 22:01:58 crc kubenswrapper[4885]: E0308 22:01:58.370377 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.153732 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550122-qd8zm"]
Mar 08 22:02:00 crc kubenswrapper[4885]: E0308 22:02:00.154522 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" containerName="keystone-cron"
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.154535 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" containerName="keystone-cron"
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.154759 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" containerName="keystone-cron"
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.155551 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550122-qd8zm"
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.158684 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.160216 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.160904 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.166640 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550122-qd8zm"]
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.306227 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2zzn\" (UniqueName: \"kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn\") pod \"auto-csr-approver-29550122-qd8zm\" (UID: \"9fd3f7a8-4264-431e-b87b-9a60f7133767\") " pod="openshift-infra/auto-csr-approver-29550122-qd8zm"
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.409856 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2zzn\" (UniqueName: \"kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn\") pod \"auto-csr-approver-29550122-qd8zm\" (UID: \"9fd3f7a8-4264-431e-b87b-9a60f7133767\") " pod="openshift-infra/auto-csr-approver-29550122-qd8zm"
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.446895 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2zzn\" (UniqueName: \"kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn\") pod \"auto-csr-approver-29550122-qd8zm\" (UID: \"9fd3f7a8-4264-431e-b87b-9a60f7133767\") " pod="openshift-infra/auto-csr-approver-29550122-qd8zm"
Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.508282 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550122-qd8zm"
Mar 08 22:02:01 crc kubenswrapper[4885]: W0308 22:02:01.015615 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fd3f7a8_4264_431e_b87b_9a60f7133767.slice/crio-f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3 WatchSource:0}: Error finding container f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3: Status 404 returned error can't find the container with id f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3
Mar 08 22:02:01 crc kubenswrapper[4885]: I0308 22:02:01.019258 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550122-qd8zm"]
Mar 08 22:02:01 crc kubenswrapper[4885]: I0308 22:02:01.663032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550122-qd8zm" event={"ID":"9fd3f7a8-4264-431e-b87b-9a60f7133767","Type":"ContainerStarted","Data":"f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3"}
Mar 08 22:02:02 crc kubenswrapper[4885]: I0308 22:02:02.683031 4885 generic.go:334] "Generic (PLEG): container finished" podID="9fd3f7a8-4264-431e-b87b-9a60f7133767" containerID="585a20322b6f004269b069c692b948e45ca9aa16a182a4437e57d97dc9bea430" exitCode=0
Mar 08 22:02:02 crc kubenswrapper[4885]: I0308 22:02:02.683108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550122-qd8zm" event={"ID":"9fd3f7a8-4264-431e-b87b-9a60f7133767","Type":"ContainerDied","Data":"585a20322b6f004269b069c692b948e45ca9aa16a182a4437e57d97dc9bea430"}
Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.154857 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550122-qd8zm"
Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.300871 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2zzn\" (UniqueName: \"kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn\") pod \"9fd3f7a8-4264-431e-b87b-9a60f7133767\" (UID: \"9fd3f7a8-4264-431e-b87b-9a60f7133767\") "
Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.309162 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn" (OuterVolumeSpecName: "kube-api-access-t2zzn") pod "9fd3f7a8-4264-431e-b87b-9a60f7133767" (UID: "9fd3f7a8-4264-431e-b87b-9a60f7133767"). InnerVolumeSpecName "kube-api-access-t2zzn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.403704 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2zzn\" (UniqueName: \"kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn\") on node \"crc\" DevicePath \"\""
Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.707374 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550122-qd8zm" event={"ID":"9fd3f7a8-4264-431e-b87b-9a60f7133767","Type":"ContainerDied","Data":"f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3"}
Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.707427 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3"
Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.707485 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550122-qd8zm"
Mar 08 22:02:05 crc kubenswrapper[4885]: I0308 22:02:05.297747 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550116-p5zpr"]
Mar 08 22:02:05 crc kubenswrapper[4885]: I0308 22:02:05.305893 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550116-p5zpr"]
Mar 08 22:02:05 crc kubenswrapper[4885]: I0308 22:02:05.389091 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480855ed-5f7f-4fb4-99dc-ced66ce15999" path="/var/lib/kubelet/pods/480855ed-5f7f-4fb4-99dc-ced66ce15999/volumes"
Mar 08 22:02:13 crc kubenswrapper[4885]: I0308 22:02:13.373557 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"
Mar 08 22:02:13 crc kubenswrapper[4885]: E0308 22:02:13.374451 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:02:14 crc kubenswrapper[4885]: I0308 22:02:14.761537 4885 scope.go:117] "RemoveContainer" containerID="6341a20e412f738fa67c5354da928321dc5ae4ed993b46a3d4ae33371480585f"
Mar 08 22:02:15 crc kubenswrapper[4885]: E0308 22:02:15.059579 4885 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.80:58736->38.102.83.80:33667: write tcp 38.102.83.80:58736->38.102.83.80:33667: write: broken pipe
Mar 08 22:02:24 crc kubenswrapper[4885]: I0308 22:02:24.340309 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Mar 08 22:02:24 crc kubenswrapper[4885]: I0308 22:02:24.341456
4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" containerName="adoption" containerID="cri-o://ece949f19629c28be550e718314e58a7cadf1e2c5464306896ed9854368d9a69" gracePeriod=30 Mar 08 22:02:27 crc kubenswrapper[4885]: I0308 22:02:27.370801 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:02:27 crc kubenswrapper[4885]: E0308 22:02:27.372390 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:02:40 crc kubenswrapper[4885]: I0308 22:02:40.369404 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:02:40 crc kubenswrapper[4885]: E0308 22:02:40.370391 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:02:52 crc kubenswrapper[4885]: I0308 22:02:52.369165 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:02:52 crc kubenswrapper[4885]: E0308 22:02:52.370251 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:02:54 crc kubenswrapper[4885]: I0308 22:02:54.430866 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1a10ccbd-e30c-478f-84a4-c869a8cd0924","Type":"ContainerDied","Data":"ece949f19629c28be550e718314e58a7cadf1e2c5464306896ed9854368d9a69"} Mar 08 22:02:54 crc kubenswrapper[4885]: I0308 22:02:54.430732 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" containerID="ece949f19629c28be550e718314e58a7cadf1e2c5464306896ed9854368d9a69" exitCode=137 Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.007413 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.069357 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scrbk\" (UniqueName: \"kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk\") pod \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.089741 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk" (OuterVolumeSpecName: "kube-api-access-scrbk") pod "1a10ccbd-e30c-478f-84a4-c869a8cd0924" (UID: "1a10ccbd-e30c-478f-84a4-c869a8cd0924"). InnerVolumeSpecName "kube-api-access-scrbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.171652 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") pod \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.172650 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scrbk\" (UniqueName: \"kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk\") on node \"crc\" DevicePath \"\"" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.195250 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019" (OuterVolumeSpecName: "mariadb-data") pod "1a10ccbd-e30c-478f-84a4-c869a8cd0924" (UID: "1a10ccbd-e30c-478f-84a4-c869a8cd0924"). InnerVolumeSpecName "pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.275468 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") on node \"crc\" " Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.305996 4885 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.306165 4885 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019") on node "crc" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.378274 4885 reconciler_common.go:293] "Volume detached for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") on node \"crc\" DevicePath \"\"" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.444269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1a10ccbd-e30c-478f-84a4-c869a8cd0924","Type":"ContainerDied","Data":"035005254ed1ada4ad851bbbcd646012f92c499c388cf96db3c6e6fbc8bfb688"} Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.444320 4885 scope.go:117] "RemoveContainer" containerID="ece949f19629c28be550e718314e58a7cadf1e2c5464306896ed9854368d9a69" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.444388 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.470971 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.483693 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Mar 08 22:02:56 crc kubenswrapper[4885]: I0308 22:02:56.178253 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 08 22:02:56 crc kubenswrapper[4885]: I0308 22:02:56.178516 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" containerName="adoption" containerID="cri-o://fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19" gracePeriod=30 Mar 08 22:02:57 crc kubenswrapper[4885]: I0308 22:02:57.385021 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" path="/var/lib/kubelet/pods/1a10ccbd-e30c-478f-84a4-c869a8cd0924/volumes" Mar 08 22:03:06 crc kubenswrapper[4885]: I0308 22:03:06.368825 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:03:06 crc kubenswrapper[4885]: E0308 22:03:06.369642 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:03:20 crc kubenswrapper[4885]: I0308 22:03:20.368700 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:03:20 crc kubenswrapper[4885]: 
E0308 22:03:20.369705 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.736201 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.840030 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") pod \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.840124 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert\") pod \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.840145 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmb5r\" (UniqueName: \"kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r\") pod \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.845433 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r" (OuterVolumeSpecName: "kube-api-access-fmb5r") pod 
"a086771f-d0fc-4265-b8ba-a414a7f6c7d0" (UID: "a086771f-d0fc-4265-b8ba-a414a7f6c7d0"). InnerVolumeSpecName "kube-api-access-fmb5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.847047 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "a086771f-d0fc-4265-b8ba-a414a7f6c7d0" (UID: "a086771f-d0fc-4265-b8ba-a414a7f6c7d0"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.856452 4885 generic.go:334] "Generic (PLEG): container finished" podID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" containerID="fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19" exitCode=137 Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.856504 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a086771f-d0fc-4265-b8ba-a414a7f6c7d0","Type":"ContainerDied","Data":"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19"} Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.856533 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a086771f-d0fc-4265-b8ba-a414a7f6c7d0","Type":"ContainerDied","Data":"2b2a6f955da79537fe6939ae44e2e8e65e67c9ab78da8f292a75babe5150e678"} Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.856553 4885 scope.go:117] "RemoveContainer" containerID="fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.856559 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.864007 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f" (OuterVolumeSpecName: "ovn-data") pod "a086771f-d0fc-4265-b8ba-a414a7f6c7d0" (UID: "a086771f-d0fc-4265-b8ba-a414a7f6c7d0"). InnerVolumeSpecName "pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.942880 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") on node \"crc\" " Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.942912 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.942940 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmb5r\" (UniqueName: \"kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.964212 4885 scope.go:117] "RemoveContainer" containerID="fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19" Mar 08 22:03:26 crc kubenswrapper[4885]: E0308 22:03:26.964521 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19\": container with ID starting with fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19 not found: ID does not exist" 
containerID="fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.964558 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19"} err="failed to get container status \"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19\": rpc error: code = NotFound desc = could not find container \"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19\": container with ID starting with fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19 not found: ID does not exist" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.984284 4885 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.984504 4885 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f") on node "crc" Mar 08 22:03:27 crc kubenswrapper[4885]: I0308 22:03:27.044519 4885 reconciler_common.go:293] "Volume detached for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:27 crc kubenswrapper[4885]: I0308 22:03:27.199652 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 08 22:03:27 crc kubenswrapper[4885]: I0308 22:03:27.214082 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Mar 08 22:03:27 crc kubenswrapper[4885]: I0308 22:03:27.411227 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" 
path="/var/lib/kubelet/pods/a086771f-d0fc-4265-b8ba-a414a7f6c7d0/volumes" Mar 08 22:03:35 crc kubenswrapper[4885]: I0308 22:03:35.369111 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:03:35 crc kubenswrapper[4885]: E0308 22:03:35.370384 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.746427 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:43 crc kubenswrapper[4885]: E0308 22:03:43.747758 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.747776 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: E0308 22:03:43.747795 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.747805 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: E0308 22:03:43.747824 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd3f7a8-4264-431e-b87b-9a60f7133767" containerName="oc" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.747832 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9fd3f7a8-4264-431e-b87b-9a60f7133767" containerName="oc" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.748146 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.748171 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd3f7a8-4264-431e-b87b-9a60f7133767" containerName="oc" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.748195 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.750274 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.781940 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.874176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.874439 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4tj4\" (UniqueName: \"kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.874670 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.977418 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4tj4\" (UniqueName: \"kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.977570 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.977707 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.978297 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.978312 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:44 crc kubenswrapper[4885]: I0308 22:03:44.002726 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4tj4\" (UniqueName: \"kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:44 crc kubenswrapper[4885]: I0308 22:03:44.097657 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:44 crc kubenswrapper[4885]: I0308 22:03:44.604398 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:44 crc kubenswrapper[4885]: W0308 22:03:44.627996 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod669459aa_3d2d_4661_9b8f_61559e8ddd40.slice/crio-d89c93f64baeef2f098f805a0a92789e655f00e6b9b4104e174e019efa3c92c9 WatchSource:0}: Error finding container d89c93f64baeef2f098f805a0a92789e655f00e6b9b4104e174e019efa3c92c9: Status 404 returned error can't find the container with id d89c93f64baeef2f098f805a0a92789e655f00e6b9b4104e174e019efa3c92c9 Mar 08 22:03:45 crc kubenswrapper[4885]: I0308 22:03:45.077176 4885 generic.go:334] "Generic (PLEG): container finished" podID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerID="6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e" exitCode=0 Mar 08 22:03:45 crc kubenswrapper[4885]: I0308 22:03:45.077243 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" 
event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerDied","Data":"6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e"} Mar 08 22:03:45 crc kubenswrapper[4885]: I0308 22:03:45.077635 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerStarted","Data":"d89c93f64baeef2f098f805a0a92789e655f00e6b9b4104e174e019efa3c92c9"} Mar 08 22:03:46 crc kubenswrapper[4885]: I0308 22:03:46.097164 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerStarted","Data":"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6"} Mar 08 22:03:47 crc kubenswrapper[4885]: I0308 22:03:47.108660 4885 generic.go:334] "Generic (PLEG): container finished" podID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerID="d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6" exitCode=0 Mar 08 22:03:47 crc kubenswrapper[4885]: I0308 22:03:47.108729 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerDied","Data":"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6"} Mar 08 22:03:48 crc kubenswrapper[4885]: I0308 22:03:48.123219 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerStarted","Data":"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393"} Mar 08 22:03:48 crc kubenswrapper[4885]: I0308 22:03:48.144768 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-66f54" podStartSLOduration=2.659669096 podStartE2EDuration="5.144749667s" podCreationTimestamp="2026-03-08 22:03:43 +0000 
UTC" firstStartedPulling="2026-03-08 22:03:45.079481603 +0000 UTC m=+9126.475535626" lastFinishedPulling="2026-03-08 22:03:47.564562164 +0000 UTC m=+9128.960616197" observedRunningTime="2026-03-08 22:03:48.14184135 +0000 UTC m=+9129.537895373" watchObservedRunningTime="2026-03-08 22:03:48.144749667 +0000 UTC m=+9129.540803680" Mar 08 22:03:49 crc kubenswrapper[4885]: I0308 22:03:49.375024 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:03:49 crc kubenswrapper[4885]: E0308 22:03:49.375627 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:03:54 crc kubenswrapper[4885]: I0308 22:03:54.098544 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:54 crc kubenswrapper[4885]: I0308 22:03:54.099109 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:54 crc kubenswrapper[4885]: I0308 22:03:54.192954 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:54 crc kubenswrapper[4885]: I0308 22:03:54.291054 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:54 crc kubenswrapper[4885]: I0308 22:03:54.441643 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.232216 4885 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-66f54" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="registry-server" containerID="cri-o://d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393" gracePeriod=2 Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.748706 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.888462 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities\") pod \"669459aa-3d2d-4661-9b8f-61559e8ddd40\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.888533 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content\") pod \"669459aa-3d2d-4661-9b8f-61559e8ddd40\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.888568 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4tj4\" (UniqueName: \"kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4\") pod \"669459aa-3d2d-4661-9b8f-61559e8ddd40\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.889913 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities" (OuterVolumeSpecName: "utilities") pod "669459aa-3d2d-4661-9b8f-61559e8ddd40" (UID: "669459aa-3d2d-4661-9b8f-61559e8ddd40"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.896178 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4" (OuterVolumeSpecName: "kube-api-access-x4tj4") pod "669459aa-3d2d-4661-9b8f-61559e8ddd40" (UID: "669459aa-3d2d-4661-9b8f-61559e8ddd40"). InnerVolumeSpecName "kube-api-access-x4tj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.913640 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "669459aa-3d2d-4661-9b8f-61559e8ddd40" (UID: "669459aa-3d2d-4661-9b8f-61559e8ddd40"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.991638 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.991670 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.991704 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4tj4\" (UniqueName: \"kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.258359 4885 generic.go:334] "Generic (PLEG): container finished" podID="669459aa-3d2d-4661-9b8f-61559e8ddd40" 
containerID="d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393" exitCode=0 Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.258432 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerDied","Data":"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393"} Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.258478 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.258565 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerDied","Data":"d89c93f64baeef2f098f805a0a92789e655f00e6b9b4104e174e019efa3c92c9"} Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.258610 4885 scope.go:117] "RemoveContainer" containerID="d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.299612 4885 scope.go:117] "RemoveContainer" containerID="d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.342010 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.346661 4885 scope.go:117] "RemoveContainer" containerID="6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.360513 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.395256 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" 
path="/var/lib/kubelet/pods/669459aa-3d2d-4661-9b8f-61559e8ddd40/volumes" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.401075 4885 scope.go:117] "RemoveContainer" containerID="d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393" Mar 08 22:03:57 crc kubenswrapper[4885]: E0308 22:03:57.401448 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393\": container with ID starting with d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393 not found: ID does not exist" containerID="d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.401486 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393"} err="failed to get container status \"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393\": rpc error: code = NotFound desc = could not find container \"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393\": container with ID starting with d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393 not found: ID does not exist" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.401512 4885 scope.go:117] "RemoveContainer" containerID="d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6" Mar 08 22:03:57 crc kubenswrapper[4885]: E0308 22:03:57.401757 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6\": container with ID starting with d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6 not found: ID does not exist" containerID="d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6" Mar 08 22:03:57 crc kubenswrapper[4885]: 
I0308 22:03:57.401794 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6"} err="failed to get container status \"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6\": rpc error: code = NotFound desc = could not find container \"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6\": container with ID starting with d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6 not found: ID does not exist" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.401818 4885 scope.go:117] "RemoveContainer" containerID="6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e" Mar 08 22:03:57 crc kubenswrapper[4885]: E0308 22:03:57.402816 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e\": container with ID starting with 6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e not found: ID does not exist" containerID="6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.402853 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e"} err="failed to get container status \"6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e\": rpc error: code = NotFound desc = could not find container \"6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e\": container with ID starting with 6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e not found: ID does not exist" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.141375 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550124-cd88x"] Mar 08 22:04:00 crc 
kubenswrapper[4885]: E0308 22:04:00.142325 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="registry-server" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.142341 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="registry-server" Mar 08 22:04:00 crc kubenswrapper[4885]: E0308 22:04:00.142362 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="extract-content" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.142368 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="extract-content" Mar 08 22:04:00 crc kubenswrapper[4885]: E0308 22:04:00.142392 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="extract-utilities" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.142398 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="extract-utilities" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.142630 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="registry-server" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.143387 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.145721 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.145853 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.146178 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.157258 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550124-cd88x"] Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.265217 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp9tl\" (UniqueName: \"kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl\") pod \"auto-csr-approver-29550124-cd88x\" (UID: \"31bb0f9f-aafb-4d40-9ef4-60deec075e85\") " pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.367005 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp9tl\" (UniqueName: \"kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl\") pod \"auto-csr-approver-29550124-cd88x\" (UID: \"31bb0f9f-aafb-4d40-9ef4-60deec075e85\") " pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.400253 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp9tl\" (UniqueName: \"kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl\") pod \"auto-csr-approver-29550124-cd88x\" (UID: \"31bb0f9f-aafb-4d40-9ef4-60deec075e85\") " 
pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.462525 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:01 crc kubenswrapper[4885]: I0308 22:04:01.041903 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550124-cd88x"] Mar 08 22:04:01 crc kubenswrapper[4885]: W0308 22:04:01.044382 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31bb0f9f_aafb_4d40_9ef4_60deec075e85.slice/crio-806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518 WatchSource:0}: Error finding container 806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518: Status 404 returned error can't find the container with id 806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518 Mar 08 22:04:01 crc kubenswrapper[4885]: I0308 22:04:01.306425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550124-cd88x" event={"ID":"31bb0f9f-aafb-4d40-9ef4-60deec075e85","Type":"ContainerStarted","Data":"806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518"} Mar 08 22:04:03 crc kubenswrapper[4885]: I0308 22:04:03.337632 4885 generic.go:334] "Generic (PLEG): container finished" podID="31bb0f9f-aafb-4d40-9ef4-60deec075e85" containerID="1e7a14780a56f990fdb7ec2362f5fabc1bb27c3bceac15a1a207c4524477403d" exitCode=0 Mar 08 22:04:03 crc kubenswrapper[4885]: I0308 22:04:03.337791 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550124-cd88x" event={"ID":"31bb0f9f-aafb-4d40-9ef4-60deec075e85","Type":"ContainerDied","Data":"1e7a14780a56f990fdb7ec2362f5fabc1bb27c3bceac15a1a207c4524477403d"} Mar 08 22:04:04 crc kubenswrapper[4885]: I0308 22:04:04.368406 4885 scope.go:117] "RemoveContainer" 
containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:04:04 crc kubenswrapper[4885]: E0308 22:04:04.369137 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.065866 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.208502 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp9tl\" (UniqueName: \"kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl\") pod \"31bb0f9f-aafb-4d40-9ef4-60deec075e85\" (UID: \"31bb0f9f-aafb-4d40-9ef4-60deec075e85\") " Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.214403 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl" (OuterVolumeSpecName: "kube-api-access-lp9tl") pod "31bb0f9f-aafb-4d40-9ef4-60deec075e85" (UID: "31bb0f9f-aafb-4d40-9ef4-60deec075e85"). InnerVolumeSpecName "kube-api-access-lp9tl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.310974 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp9tl\" (UniqueName: \"kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl\") on node \"crc\" DevicePath \"\"" Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.364030 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550124-cd88x" event={"ID":"31bb0f9f-aafb-4d40-9ef4-60deec075e85","Type":"ContainerDied","Data":"806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518"} Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.364069 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518" Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.364394 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:06 crc kubenswrapper[4885]: I0308 22:04:06.162583 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550118-28jxb"] Mar 08 22:04:06 crc kubenswrapper[4885]: I0308 22:04:06.171840 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550118-28jxb"] Mar 08 22:04:07 crc kubenswrapper[4885]: I0308 22:04:07.402904 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" path="/var/lib/kubelet/pods/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc/volumes" Mar 08 22:04:14 crc kubenswrapper[4885]: I0308 22:04:14.918125 4885 scope.go:117] "RemoveContainer" containerID="0d8b1adfd4f970a1d96597c5128035560d0e6296ee9788584fde3df1fcb99135" Mar 08 22:04:17 crc kubenswrapper[4885]: I0308 22:04:17.369298 4885 scope.go:117] "RemoveContainer" 
containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:04:17 crc kubenswrapper[4885]: E0308 22:04:17.369996 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.800932 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58ck9/must-gather-x7xtd"] Mar 08 22:04:21 crc kubenswrapper[4885]: E0308 22:04:21.801864 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bb0f9f-aafb-4d40-9ef4-60deec075e85" containerName="oc" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.801877 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bb0f9f-aafb-4d40-9ef4-60deec075e85" containerName="oc" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.802092 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="31bb0f9f-aafb-4d40-9ef4-60deec075e85" containerName="oc" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.811320 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.816443 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-58ck9"/"default-dockercfg-5z8m9" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.817407 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-58ck9"/"openshift-service-ca.crt" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.818643 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-58ck9"/"kube-root-ca.crt" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.835111 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.835220 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsvfg\" (UniqueName: \"kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.857777 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58ck9/must-gather-x7xtd"] Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.936829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " 
pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.937199 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsvfg\" (UniqueName: \"kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.937485 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.957288 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsvfg\" (UniqueName: \"kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:22 crc kubenswrapper[4885]: I0308 22:04:22.149102 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:22 crc kubenswrapper[4885]: I0308 22:04:22.629113 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58ck9/must-gather-x7xtd"] Mar 08 22:04:23 crc kubenswrapper[4885]: I0308 22:04:23.586530 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/must-gather-x7xtd" event={"ID":"009e478e-8f33-43d1-aded-7d3084ed486e","Type":"ContainerStarted","Data":"e258e3e225eab9a4f8266b5151c349549e25ac4eb436b890df34a0b800489fe9"} Mar 08 22:04:29 crc kubenswrapper[4885]: I0308 22:04:29.380428 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:04:29 crc kubenswrapper[4885]: E0308 22:04:29.381718 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:04:30 crc kubenswrapper[4885]: I0308 22:04:30.680877 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/must-gather-x7xtd" event={"ID":"009e478e-8f33-43d1-aded-7d3084ed486e","Type":"ContainerStarted","Data":"d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916"} Mar 08 22:04:30 crc kubenswrapper[4885]: I0308 22:04:30.682237 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/must-gather-x7xtd" event={"ID":"009e478e-8f33-43d1-aded-7d3084ed486e","Type":"ContainerStarted","Data":"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b"} Mar 08 22:04:30 crc kubenswrapper[4885]: I0308 22:04:30.728487 4885 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-58ck9/must-gather-x7xtd" podStartSLOduration=2.910304442 podStartE2EDuration="9.728452582s" podCreationTimestamp="2026-03-08 22:04:21 +0000 UTC" firstStartedPulling="2026-03-08 22:04:22.631689989 +0000 UTC m=+9164.027744052" lastFinishedPulling="2026-03-08 22:04:29.449838169 +0000 UTC m=+9170.845892192" observedRunningTime="2026-03-08 22:04:30.712025153 +0000 UTC m=+9172.108079206" watchObservedRunningTime="2026-03-08 22:04:30.728452582 +0000 UTC m=+9172.124506645" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.185754 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58ck9/crc-debug-7nfbg"] Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.189717 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.317637 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnff6\" (UniqueName: \"kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6\") pod \"crc-debug-7nfbg\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.317897 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host\") pod \"crc-debug-7nfbg\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.420452 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnff6\" (UniqueName: \"kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6\") pod \"crc-debug-7nfbg\" (UID: 
\"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.420656 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host\") pod \"crc-debug-7nfbg\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.420810 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host\") pod \"crc-debug-7nfbg\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.439193 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnff6\" (UniqueName: \"kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6\") pod \"crc-debug-7nfbg\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.512577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-7nfbg"
Mar 08 22:04:34 crc kubenswrapper[4885]: W0308 22:04:34.580624 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d6c9d74_a8c6_4436_a834_8c339a59b15f.slice/crio-beecb67750bb0ffbf8d767799c985561edec04ec3b01c071981683d2e27340ca WatchSource:0}: Error finding container beecb67750bb0ffbf8d767799c985561edec04ec3b01c071981683d2e27340ca: Status 404 returned error can't find the container with id beecb67750bb0ffbf8d767799c985561edec04ec3b01c071981683d2e27340ca
Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.740460 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" event={"ID":"3d6c9d74-a8c6-4436-a834-8c339a59b15f","Type":"ContainerStarted","Data":"beecb67750bb0ffbf8d767799c985561edec04ec3b01c071981683d2e27340ca"}
Mar 08 22:04:40 crc kubenswrapper[4885]: I0308 22:04:40.368167 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"
Mar 08 22:04:40 crc kubenswrapper[4885]: E0308 22:04:40.368820 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:04:46 crc kubenswrapper[4885]: I0308 22:04:46.856127 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" event={"ID":"3d6c9d74-a8c6-4436-a834-8c339a59b15f","Type":"ContainerStarted","Data":"409ba0e76cc123c34984a34f377c29a5545224b014b0468202c033f80128ed06"}
Mar 08 22:04:46 crc kubenswrapper[4885]: I0308 22:04:46.872782 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" podStartSLOduration=1.058227252 podStartE2EDuration="12.872766564s" podCreationTimestamp="2026-03-08 22:04:34 +0000 UTC" firstStartedPulling="2026-03-08 22:04:34.583958376 +0000 UTC m=+9175.980012399" lastFinishedPulling="2026-03-08 22:04:46.398497688 +0000 UTC m=+9187.794551711" observedRunningTime="2026-03-08 22:04:46.869977591 +0000 UTC m=+9188.266031634" watchObservedRunningTime="2026-03-08 22:04:46.872766564 +0000 UTC m=+9188.268820587"
Mar 08 22:04:53 crc kubenswrapper[4885]: I0308 22:04:53.368821 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"
Mar 08 22:04:53 crc kubenswrapper[4885]: E0308 22:04:53.369667 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.895199 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"]
Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.897678 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.910168 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"]
Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.971716 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.971787 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkgfr\" (UniqueName: \"kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.972017 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.073853 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.073943 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgfr\" (UniqueName: \"kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.073999 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.074595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.074835 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.091867 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkgfr\" (UniqueName: \"kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:04:59 crc kubenswrapper[4885]: I0308 22:04:59.859173 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:05:00 crc kubenswrapper[4885]: I0308 22:05:00.388817 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"]
Mar 08 22:05:00 crc kubenswrapper[4885]: I0308 22:05:00.996701 4885 generic.go:334] "Generic (PLEG): container finished" podID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerID="d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609" exitCode=0
Mar 08 22:05:00 crc kubenswrapper[4885]: I0308 22:05:00.996785 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerDied","Data":"d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609"}
Mar 08 22:05:00 crc kubenswrapper[4885]: I0308 22:05:00.997047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerStarted","Data":"8711031805b818de77f147045a567e51e1dcee443314467dfab3ca8218b78d52"}
Mar 08 22:05:00 crc kubenswrapper[4885]: I0308 22:05:00.999658 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 08 22:05:03 crc kubenswrapper[4885]: I0308 22:05:03.020772 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerStarted","Data":"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2"}
Mar 08 22:05:05 crc kubenswrapper[4885]: I0308 22:05:05.040657 4885 generic.go:334] "Generic (PLEG): container finished" podID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerID="8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2" exitCode=0
Mar 08 22:05:05 crc kubenswrapper[4885]: I0308 22:05:05.040745 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerDied","Data":"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2"}
Mar 08 22:05:06 crc kubenswrapper[4885]: I0308 22:05:06.060257 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerStarted","Data":"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082"}
Mar 08 22:05:06 crc kubenswrapper[4885]: I0308 22:05:06.088008 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xdkwp" podStartSLOduration=4.635462104 podStartE2EDuration="9.087983071s" podCreationTimestamp="2026-03-08 22:04:57 +0000 UTC" firstStartedPulling="2026-03-08 22:05:00.999350918 +0000 UTC m=+9202.395404941" lastFinishedPulling="2026-03-08 22:05:05.451871885 +0000 UTC m=+9206.847925908" observedRunningTime="2026-03-08 22:05:06.079854474 +0000 UTC m=+9207.475908497" watchObservedRunningTime="2026-03-08 22:05:06.087983071 +0000 UTC m=+9207.484037124"
Mar 08 22:05:08 crc kubenswrapper[4885]: I0308 22:05:08.369404 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"
Mar 08 22:05:08 crc kubenswrapper[4885]: E0308 22:05:08.369812 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:05:09 crc kubenswrapper[4885]: I0308 22:05:09.089319 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d6c9d74-a8c6-4436-a834-8c339a59b15f" containerID="409ba0e76cc123c34984a34f377c29a5545224b014b0468202c033f80128ed06" exitCode=0
Mar 08 22:05:09 crc kubenswrapper[4885]: I0308 22:05:09.089416 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" event={"ID":"3d6c9d74-a8c6-4436-a834-8c339a59b15f","Type":"ContainerDied","Data":"409ba0e76cc123c34984a34f377c29a5545224b014b0468202c033f80128ed06"}
Mar 08 22:05:09 crc kubenswrapper[4885]: I0308 22:05:09.859849 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:05:09 crc kubenswrapper[4885]: I0308 22:05:09.859885 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:05:09 crc kubenswrapper[4885]: I0308 22:05:09.940357 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.154442 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.212652 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-7nfbg"
Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.217410 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"]
Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.246024 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-7nfbg"]
Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.255353 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-7nfbg"]
Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.324736 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host\") pod \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") "
Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.325111 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnff6\" (UniqueName: \"kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6\") pod \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") "
Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.324855 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host" (OuterVolumeSpecName: "host") pod "3d6c9d74-a8c6-4436-a834-8c339a59b15f" (UID: "3d6c9d74-a8c6-4436-a834-8c339a59b15f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.325682 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host\") on node \"crc\" DevicePath \"\""
Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.332868 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6" (OuterVolumeSpecName: "kube-api-access-hnff6") pod "3d6c9d74-a8c6-4436-a834-8c339a59b15f" (UID: "3d6c9d74-a8c6-4436-a834-8c339a59b15f"). InnerVolumeSpecName "kube-api-access-hnff6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.427491 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnff6\" (UniqueName: \"kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6\") on node \"crc\" DevicePath \"\""
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.109370 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beecb67750bb0ffbf8d767799c985561edec04ec3b01c071981683d2e27340ca"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.109419 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-7nfbg"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.380832 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6c9d74-a8c6-4436-a834-8c339a59b15f" path="/var/lib/kubelet/pods/3d6c9d74-a8c6-4436-a834-8c339a59b15f/volumes"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.432017 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58ck9/crc-debug-5zqd9"]
Mar 08 22:05:11 crc kubenswrapper[4885]: E0308 22:05:11.432437 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6c9d74-a8c6-4436-a834-8c339a59b15f" containerName="container-00"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.432454 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6c9d74-a8c6-4436-a834-8c339a59b15f" containerName="container-00"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.432664 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6c9d74-a8c6-4436-a834-8c339a59b15f" containerName="container-00"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.433412 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-5zqd9"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.549880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6924f\" (UniqueName: \"kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " pod="openshift-must-gather-58ck9/crc-debug-5zqd9"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.550168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " pod="openshift-must-gather-58ck9/crc-debug-5zqd9"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.652724 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " pod="openshift-must-gather-58ck9/crc-debug-5zqd9"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.652963 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6924f\" (UniqueName: \"kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " pod="openshift-must-gather-58ck9/crc-debug-5zqd9"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.652969 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " pod="openshift-must-gather-58ck9/crc-debug-5zqd9"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.669597 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6924f\" (UniqueName: \"kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " pod="openshift-must-gather-58ck9/crc-debug-5zqd9"
Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.754281 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-5zqd9"
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.118496 4885 generic.go:334] "Generic (PLEG): container finished" podID="ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" containerID="857ab72a7c2126bc7b60e9aa2682b2b13391c1878713ec25e9c07f2cbc164789" exitCode=0
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.118714 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-5zqd9" event={"ID":"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733","Type":"ContainerDied","Data":"857ab72a7c2126bc7b60e9aa2682b2b13391c1878713ec25e9c07f2cbc164789"}
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.118805 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-5zqd9" event={"ID":"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733","Type":"ContainerStarted","Data":"890a75bcfecb4469b6bad0f0bf13d255a9327ba2fbb2b8828d4b0a1a3e7278f9"}
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.118970 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xdkwp" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="registry-server" containerID="cri-o://af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082" gracePeriod=2
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.316110 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-5zqd9"]
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.329503 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-5zqd9"]
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.596336 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.687413 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities\") pod \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") "
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.687763 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content\") pod \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") "
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.687822 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkgfr\" (UniqueName: \"kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr\") pod \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") "
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.688834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities" (OuterVolumeSpecName: "utilities") pod "ce29cc99-ca85-4a7b-b027-1bc84fa92252" (UID: "ce29cc99-ca85-4a7b-b027-1bc84fa92252"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.697568 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr" (OuterVolumeSpecName: "kube-api-access-gkgfr") pod "ce29cc99-ca85-4a7b-b027-1bc84fa92252" (UID: "ce29cc99-ca85-4a7b-b027-1bc84fa92252"). InnerVolumeSpecName "kube-api-access-gkgfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.770314 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce29cc99-ca85-4a7b-b027-1bc84fa92252" (UID: "ce29cc99-ca85-4a7b-b027-1bc84fa92252"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.791600 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.791639 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkgfr\" (UniqueName: \"kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr\") on node \"crc\" DevicePath \"\""
Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.791649 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities\") on node \"crc\" DevicePath \"\""
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.129821 4885 generic.go:334] "Generic (PLEG): container finished" podID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerID="af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082" exitCode=0
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.129879 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerDied","Data":"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082"}
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.130192 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerDied","Data":"8711031805b818de77f147045a567e51e1dcee443314467dfab3ca8218b78d52"}
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.130213 4885 scope.go:117] "RemoveContainer" containerID="af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.129892 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdkwp"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.234685 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-5zqd9"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.236534 4885 scope.go:117] "RemoveContainer" containerID="8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.260193 4885 scope.go:117] "RemoveContainer" containerID="d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.266830 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"]
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.281053 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"]
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.301266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host\") pod \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") "
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.301443 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6924f\" (UniqueName: \"kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f\") pod \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") "
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.303161 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host" (OuterVolumeSpecName: "host") pod "ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" (UID: "ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.308085 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f" (OuterVolumeSpecName: "kube-api-access-6924f") pod "ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" (UID: "ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733"). InnerVolumeSpecName "kube-api-access-6924f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.328107 4885 scope.go:117] "RemoveContainer" containerID="af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082"
Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.332047 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082\": container with ID starting with af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082 not found: ID does not exist" containerID="af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.332090 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082"} err="failed to get container status \"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082\": rpc error: code = NotFound desc = could not find container \"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082\": container with ID starting with af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082 not found: ID does not exist"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.332114 4885 scope.go:117] "RemoveContainer" containerID="8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2"
Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.336062 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2\": container with ID starting with 8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2 not found: ID does not exist" containerID="8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.336107 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2"} err="failed to get container status \"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2\": rpc error: code = NotFound desc = could not find container \"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2\": container with ID starting with 8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2 not found: ID does not exist"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.336135 4885 scope.go:117] "RemoveContainer" containerID="d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609"
Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.341083 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609\": container with ID starting with d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609 not found: ID does not exist" containerID="d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.341131 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609"} err="failed to get container status \"d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609\": rpc error: code = NotFound desc = could not find container \"d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609\": container with ID starting with d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609 not found: ID does not exist"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.378622 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" path="/var/lib/kubelet/pods/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733/volumes"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.379151 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" path="/var/lib/kubelet/pods/ce29cc99-ca85-4a7b-b027-1bc84fa92252/volumes"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.404095 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host\") on node \"crc\" DevicePath \"\""
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.404123 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6924f\" (UniqueName: \"kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f\") on node \"crc\" DevicePath \"\""
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.612935 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58ck9/crc-debug-xq57v"]
Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.613383 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="extract-utilities"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613404 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="extract-utilities"
Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.613420 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="registry-server"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613426 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="registry-server"
Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.613441 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" containerName="container-00"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613448 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" containerName="container-00"
Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.613468 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="extract-content"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613473 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="extract-content"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613663 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="registry-server"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613680 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" containerName="container-00"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.614403 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-xq57v"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.710707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.711147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdcd\" (UniqueName: \"kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.813437 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.813591 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.813603 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdcd\" (UniqueName: \"kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.831607 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdcd\" (UniqueName: \"kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v"
Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.932241 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-xq57v"
Mar 08 22:05:13 crc kubenswrapper[4885]: W0308 22:05:13.962260 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bcca738_8182_42d8_83d2_693323d43424.slice/crio-5397042cbd8e5a1efa8d5ba0ce10ab5986a028860625ac6d4f20c51a9b241a27 WatchSource:0}: Error finding container 5397042cbd8e5a1efa8d5ba0ce10ab5986a028860625ac6d4f20c51a9b241a27: Status 404 returned error can't find the container with id 5397042cbd8e5a1efa8d5ba0ce10ab5986a028860625ac6d4f20c51a9b241a27
Mar 08 22:05:14 crc kubenswrapper[4885]: I0308 22:05:14.141043 4885 scope.go:117] "RemoveContainer" containerID="857ab72a7c2126bc7b60e9aa2682b2b13391c1878713ec25e9c07f2cbc164789"
Mar 08 22:05:14 crc kubenswrapper[4885]: I0308 22:05:14.141075 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-5zqd9"
Mar 08 22:05:14 crc kubenswrapper[4885]: I0308 22:05:14.142319 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-xq57v" event={"ID":"3bcca738-8182-42d8-83d2-693323d43424","Type":"ContainerStarted","Data":"5397042cbd8e5a1efa8d5ba0ce10ab5986a028860625ac6d4f20c51a9b241a27"}
Mar 08 22:05:15 crc kubenswrapper[4885]: I0308 22:05:15.155059 4885 generic.go:334] "Generic (PLEG): container finished" podID="3bcca738-8182-42d8-83d2-693323d43424" containerID="3087ec3b25ed2328bb7f3332ce72df4c652f348dc994dcb55a4b1d6924fcf298" exitCode=0
Mar 08 22:05:15 crc kubenswrapper[4885]: I0308 22:05:15.155150 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-xq57v" event={"ID":"3bcca738-8182-42d8-83d2-693323d43424","Type":"ContainerDied","Data":"3087ec3b25ed2328bb7f3332ce72df4c652f348dc994dcb55a4b1d6924fcf298"}
Mar 08 22:05:15 crc kubenswrapper[4885]: I0308 22:05:15.207066 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-xq57v"]
Mar 08 22:05:15 crc kubenswrapper[4885]: I0308 22:05:15.211470 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-xq57v"]
Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.278515 4885 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.374785 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pdcd\" (UniqueName: \"kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd\") pod \"3bcca738-8182-42d8-83d2-693323d43424\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.375054 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host\") pod \"3bcca738-8182-42d8-83d2-693323d43424\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.375673 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host" (OuterVolumeSpecName: "host") pod "3bcca738-8182-42d8-83d2-693323d43424" (UID: "3bcca738-8182-42d8-83d2-693323d43424"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.380456 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd" (OuterVolumeSpecName: "kube-api-access-2pdcd") pod "3bcca738-8182-42d8-83d2-693323d43424" (UID: "3bcca738-8182-42d8-83d2-693323d43424"). InnerVolumeSpecName "kube-api-access-2pdcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.476954 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pdcd\" (UniqueName: \"kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.476980 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:17 crc kubenswrapper[4885]: I0308 22:05:17.175756 4885 scope.go:117] "RemoveContainer" containerID="3087ec3b25ed2328bb7f3332ce72df4c652f348dc994dcb55a4b1d6924fcf298" Mar 08 22:05:17 crc kubenswrapper[4885]: I0308 22:05:17.175794 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:17 crc kubenswrapper[4885]: I0308 22:05:17.380149 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bcca738-8182-42d8-83d2-693323d43424" path="/var/lib/kubelet/pods/3bcca738-8182-42d8-83d2-693323d43424/volumes" Mar 08 22:05:23 crc kubenswrapper[4885]: I0308 22:05:23.368406 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:05:23 crc kubenswrapper[4885]: E0308 22:05:23.369123 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:05:36 crc kubenswrapper[4885]: I0308 22:05:36.368768 4885 scope.go:117] "RemoveContainer" 
containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:05:36 crc kubenswrapper[4885]: E0308 22:05:36.369479 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.136990 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:42 crc kubenswrapper[4885]: E0308 22:05:42.139655 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcca738-8182-42d8-83d2-693323d43424" containerName="container-00" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.139790 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcca738-8182-42d8-83d2-693323d43424" containerName="container-00" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.140270 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bcca738-8182-42d8-83d2-693323d43424" containerName="container-00" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.142398 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.157342 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.245385 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2qj\" (UniqueName: \"kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.245634 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.245879 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.348135 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.348313 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zq2qj\" (UniqueName: \"kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.348506 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.349013 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.349175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.367735 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq2qj\" (UniqueName: \"kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.485782 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:43 crc kubenswrapper[4885]: I0308 22:05:43.042641 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:43 crc kubenswrapper[4885]: I0308 22:05:43.555833 4885 generic.go:334] "Generic (PLEG): container finished" podID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerID="ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9" exitCode=0 Mar 08 22:05:43 crc kubenswrapper[4885]: I0308 22:05:43.556009 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerDied","Data":"ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9"} Mar 08 22:05:43 crc kubenswrapper[4885]: I0308 22:05:43.556315 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerStarted","Data":"e7fc46e6a9bf19dbb89a694e6bf530cbb7a7bbe680bbca8623dbeae1349edcfd"} Mar 08 22:05:44 crc kubenswrapper[4885]: I0308 22:05:44.604604 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerStarted","Data":"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a"} Mar 08 22:05:46 crc kubenswrapper[4885]: I0308 22:05:46.634382 4885 generic.go:334] "Generic (PLEG): container finished" podID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerID="68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a" exitCode=0 Mar 08 22:05:46 crc kubenswrapper[4885]: I0308 22:05:46.634512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" 
event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerDied","Data":"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a"} Mar 08 22:05:47 crc kubenswrapper[4885]: I0308 22:05:47.650600 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerStarted","Data":"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e"} Mar 08 22:05:47 crc kubenswrapper[4885]: I0308 22:05:47.680679 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p99b8" podStartSLOduration=1.938572341 podStartE2EDuration="5.680651908s" podCreationTimestamp="2026-03-08 22:05:42 +0000 UTC" firstStartedPulling="2026-03-08 22:05:43.559027762 +0000 UTC m=+9244.955081795" lastFinishedPulling="2026-03-08 22:05:47.301107339 +0000 UTC m=+9248.697161362" observedRunningTime="2026-03-08 22:05:47.674111884 +0000 UTC m=+9249.070165907" watchObservedRunningTime="2026-03-08 22:05:47.680651908 +0000 UTC m=+9249.076705951" Mar 08 22:05:48 crc kubenswrapper[4885]: I0308 22:05:48.369507 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:05:48 crc kubenswrapper[4885]: E0308 22:05:48.370402 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:05:52 crc kubenswrapper[4885]: I0308 22:05:52.486071 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:52 crc 
kubenswrapper[4885]: I0308 22:05:52.486629 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:52 crc kubenswrapper[4885]: I0308 22:05:52.580958 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:52 crc kubenswrapper[4885]: I0308 22:05:52.815843 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:52 crc kubenswrapper[4885]: I0308 22:05:52.882540 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:54 crc kubenswrapper[4885]: I0308 22:05:54.738403 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p99b8" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="registry-server" containerID="cri-o://ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e" gracePeriod=2 Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.304360 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.379091 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content\") pod \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.379229 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities\") pod \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.379467 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq2qj\" (UniqueName: \"kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj\") pod \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.380554 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities" (OuterVolumeSpecName: "utilities") pod "9302823a-d143-49a4-9fcc-1e27bcd7ecd4" (UID: "9302823a-d143-49a4-9fcc-1e27bcd7ecd4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.388389 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj" (OuterVolumeSpecName: "kube-api-access-zq2qj") pod "9302823a-d143-49a4-9fcc-1e27bcd7ecd4" (UID: "9302823a-d143-49a4-9fcc-1e27bcd7ecd4"). InnerVolumeSpecName "kube-api-access-zq2qj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.459674 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9302823a-d143-49a4-9fcc-1e27bcd7ecd4" (UID: "9302823a-d143-49a4-9fcc-1e27bcd7ecd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.482565 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.483063 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.483248 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq2qj\" (UniqueName: \"kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.753083 4885 generic.go:334] "Generic (PLEG): container finished" podID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerID="ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e" exitCode=0 Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.753162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerDied","Data":"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e"} Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.753204 4885 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.753240 4885 scope.go:117] "RemoveContainer" containerID="ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.753220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerDied","Data":"e7fc46e6a9bf19dbb89a694e6bf530cbb7a7bbe680bbca8623dbeae1349edcfd"} Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.777153 4885 scope.go:117] "RemoveContainer" containerID="68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.810309 4885 scope.go:117] "RemoveContainer" containerID="ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.819260 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.840553 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.870328 4885 scope.go:117] "RemoveContainer" containerID="ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e" Mar 08 22:05:55 crc kubenswrapper[4885]: E0308 22:05:55.870961 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e\": container with ID starting with ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e not found: ID does not exist" containerID="ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.871040 
4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e"} err="failed to get container status \"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e\": rpc error: code = NotFound desc = could not find container \"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e\": container with ID starting with ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e not found: ID does not exist" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.871083 4885 scope.go:117] "RemoveContainer" containerID="68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a" Mar 08 22:05:55 crc kubenswrapper[4885]: E0308 22:05:55.871729 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a\": container with ID starting with 68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a not found: ID does not exist" containerID="68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.871780 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a"} err="failed to get container status \"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a\": rpc error: code = NotFound desc = could not find container \"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a\": container with ID starting with 68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a not found: ID does not exist" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.871807 4885 scope.go:117] "RemoveContainer" containerID="ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9" Mar 08 22:05:55 crc kubenswrapper[4885]: E0308 
22:05:55.872218 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9\": container with ID starting with ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9 not found: ID does not exist" containerID="ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.872247 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9"} err="failed to get container status \"ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9\": rpc error: code = NotFound desc = could not find container \"ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9\": container with ID starting with ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9 not found: ID does not exist" Mar 08 22:05:57 crc kubenswrapper[4885]: I0308 22:05:57.385259 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" path="/var/lib/kubelet/pods/9302823a-d143-49a4-9fcc-1e27bcd7ecd4/volumes" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.152485 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550126-tpkcw"] Mar 08 22:06:00 crc kubenswrapper[4885]: E0308 22:06:00.153451 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="extract-content" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.153473 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="extract-content" Mar 08 22:06:00 crc kubenswrapper[4885]: E0308 22:06:00.153514 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="extract-utilities" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.153528 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="extract-utilities" Mar 08 22:06:00 crc kubenswrapper[4885]: E0308 22:06:00.153556 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="registry-server" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.153571 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="registry-server" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.153964 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="registry-server" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.155228 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.161439 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.161617 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.161645 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.162223 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550126-tpkcw"] Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.296434 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfqv8\" (UniqueName: 
\"kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8\") pod \"auto-csr-approver-29550126-tpkcw\" (UID: \"db2ac46a-e5ce-45f0-8d95-2f520eebd199\") " pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.398652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfqv8\" (UniqueName: \"kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8\") pod \"auto-csr-approver-29550126-tpkcw\" (UID: \"db2ac46a-e5ce-45f0-8d95-2f520eebd199\") " pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.430582 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfqv8\" (UniqueName: \"kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8\") pod \"auto-csr-approver-29550126-tpkcw\" (UID: \"db2ac46a-e5ce-45f0-8d95-2f520eebd199\") " pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.479824 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.833314 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550126-tpkcw"] Mar 08 22:06:01 crc kubenswrapper[4885]: I0308 22:06:01.823702 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" event={"ID":"db2ac46a-e5ce-45f0-8d95-2f520eebd199","Type":"ContainerStarted","Data":"60ff3555b754410dd5d419bfe19d69a36bd479a6856103456563a59ab6dfd34a"} Mar 08 22:06:02 crc kubenswrapper[4885]: I0308 22:06:02.368640 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:06:02 crc kubenswrapper[4885]: E0308 22:06:02.369632 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:06:02 crc kubenswrapper[4885]: I0308 22:06:02.844097 4885 generic.go:334] "Generic (PLEG): container finished" podID="db2ac46a-e5ce-45f0-8d95-2f520eebd199" containerID="78835dd04d2354f01f1264d7b0e37072d10df2af40d9fb9f18dcf2dd6bfeda09" exitCode=0 Mar 08 22:06:02 crc kubenswrapper[4885]: I0308 22:06:02.844167 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" event={"ID":"db2ac46a-e5ce-45f0-8d95-2f520eebd199","Type":"ContainerDied","Data":"78835dd04d2354f01f1264d7b0e37072d10df2af40d9fb9f18dcf2dd6bfeda09"} Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.361257 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.544326 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfqv8\" (UniqueName: \"kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8\") pod \"db2ac46a-e5ce-45f0-8d95-2f520eebd199\" (UID: \"db2ac46a-e5ce-45f0-8d95-2f520eebd199\") " Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.554369 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8" (OuterVolumeSpecName: "kube-api-access-gfqv8") pod "db2ac46a-e5ce-45f0-8d95-2f520eebd199" (UID: "db2ac46a-e5ce-45f0-8d95-2f520eebd199"). InnerVolumeSpecName "kube-api-access-gfqv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.648214 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfqv8\" (UniqueName: \"kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8\") on node \"crc\" DevicePath \"\"" Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.872972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" event={"ID":"db2ac46a-e5ce-45f0-8d95-2f520eebd199","Type":"ContainerDied","Data":"60ff3555b754410dd5d419bfe19d69a36bd479a6856103456563a59ab6dfd34a"} Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.873028 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60ff3555b754410dd5d419bfe19d69a36bd479a6856103456563a59ab6dfd34a" Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.873061 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:05 crc kubenswrapper[4885]: I0308 22:06:05.463205 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550120-gtf7r"] Mar 08 22:06:05 crc kubenswrapper[4885]: I0308 22:06:05.475228 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550120-gtf7r"] Mar 08 22:06:07 crc kubenswrapper[4885]: I0308 22:06:07.381856 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df80a103-8bbc-4c66-8995-05152b8b9b66" path="/var/lib/kubelet/pods/df80a103-8bbc-4c66-8995-05152b8b9b66/volumes" Mar 08 22:06:13 crc kubenswrapper[4885]: I0308 22:06:13.369767 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:06:14 crc kubenswrapper[4885]: I0308 22:06:14.013760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d"} Mar 08 22:06:15 crc kubenswrapper[4885]: I0308 22:06:15.129474 4885 scope.go:117] "RemoveContainer" containerID="c7a9de3d006a67b60be16dde31bd3a619a9734cffde62a18b4a5fd2544360347" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.148195 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550128-4vqrp"] Mar 08 22:08:00 crc kubenswrapper[4885]: E0308 22:08:00.161261 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db2ac46a-e5ce-45f0-8d95-2f520eebd199" containerName="oc" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.161298 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2ac46a-e5ce-45f0-8d95-2f520eebd199" containerName="oc" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.163746 4885 
memory_manager.go:354] "RemoveStaleState removing state" podUID="db2ac46a-e5ce-45f0-8d95-2f520eebd199" containerName="oc" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.165893 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.170745 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.174742 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.181879 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.185797 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550128-4vqrp"] Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.324156 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68bb8\" (UniqueName: \"kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8\") pod \"auto-csr-approver-29550128-4vqrp\" (UID: \"16108583-f398-4571-9e1c-41d86a071331\") " pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.425837 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68bb8\" (UniqueName: \"kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8\") pod \"auto-csr-approver-29550128-4vqrp\" (UID: \"16108583-f398-4571-9e1c-41d86a071331\") " pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.444839 4885 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-68bb8\" (UniqueName: \"kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8\") pod \"auto-csr-approver-29550128-4vqrp\" (UID: \"16108583-f398-4571-9e1c-41d86a071331\") " pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.517543 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.994872 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550128-4vqrp"] Mar 08 22:08:01 crc kubenswrapper[4885]: I0308 22:08:01.402986 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" event={"ID":"16108583-f398-4571-9e1c-41d86a071331","Type":"ContainerStarted","Data":"8001d74b9a3b1e54a04b1bf1d770af365d07f5b75b5365df220fd54ba8fbf0f6"} Mar 08 22:08:02 crc kubenswrapper[4885]: I0308 22:08:02.413349 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" event={"ID":"16108583-f398-4571-9e1c-41d86a071331","Type":"ContainerStarted","Data":"359909f363078610b8800c0c14bcbb9a70bea6cbeb9a5e7e55a65cb5c9ec4e4c"} Mar 08 22:08:02 crc kubenswrapper[4885]: I0308 22:08:02.433452 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" podStartSLOduration=1.5373337710000001 podStartE2EDuration="2.433427137s" podCreationTimestamp="2026-03-08 22:08:00 +0000 UTC" firstStartedPulling="2026-03-08 22:08:00.993536798 +0000 UTC m=+9382.389590831" lastFinishedPulling="2026-03-08 22:08:01.889630164 +0000 UTC m=+9383.285684197" observedRunningTime="2026-03-08 22:08:02.42528097 +0000 UTC m=+9383.821335003" watchObservedRunningTime="2026-03-08 22:08:02.433427137 +0000 UTC m=+9383.829481170" Mar 08 22:08:03 crc kubenswrapper[4885]: I0308 
22:08:03.426311 4885 generic.go:334] "Generic (PLEG): container finished" podID="16108583-f398-4571-9e1c-41d86a071331" containerID="359909f363078610b8800c0c14bcbb9a70bea6cbeb9a5e7e55a65cb5c9ec4e4c" exitCode=0 Mar 08 22:08:03 crc kubenswrapper[4885]: I0308 22:08:03.426418 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" event={"ID":"16108583-f398-4571-9e1c-41d86a071331","Type":"ContainerDied","Data":"359909f363078610b8800c0c14bcbb9a70bea6cbeb9a5e7e55a65cb5c9ec4e4c"} Mar 08 22:08:04 crc kubenswrapper[4885]: I0308 22:08:04.931066 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.035811 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68bb8\" (UniqueName: \"kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8\") pod \"16108583-f398-4571-9e1c-41d86a071331\" (UID: \"16108583-f398-4571-9e1c-41d86a071331\") " Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.042503 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8" (OuterVolumeSpecName: "kube-api-access-68bb8") pod "16108583-f398-4571-9e1c-41d86a071331" (UID: "16108583-f398-4571-9e1c-41d86a071331"). InnerVolumeSpecName "kube-api-access-68bb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.142487 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68bb8\" (UniqueName: \"kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8\") on node \"crc\" DevicePath \"\"" Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.475369 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" event={"ID":"16108583-f398-4571-9e1c-41d86a071331","Type":"ContainerDied","Data":"8001d74b9a3b1e54a04b1bf1d770af365d07f5b75b5365df220fd54ba8fbf0f6"} Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.475423 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.475427 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8001d74b9a3b1e54a04b1bf1d770af365d07f5b75b5365df220fd54ba8fbf0f6" Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.535418 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550122-qd8zm"] Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.549730 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550122-qd8zm"] Mar 08 22:08:07 crc kubenswrapper[4885]: I0308 22:08:07.392508 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd3f7a8-4264-431e-b87b-9a60f7133767" path="/var/lib/kubelet/pods/9fd3f7a8-4264-431e-b87b-9a60f7133767/volumes" Mar 08 22:08:15 crc kubenswrapper[4885]: I0308 22:08:15.342869 4885 scope.go:117] "RemoveContainer" containerID="585a20322b6f004269b069c692b948e45ca9aa16a182a4437e57d97dc9bea430" Mar 08 22:08:32 crc kubenswrapper[4885]: I0308 22:08:32.818481 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:08:32 crc kubenswrapper[4885]: I0308 22:08:32.819049 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:09:02 crc kubenswrapper[4885]: I0308 22:09:02.822966 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:09:02 crc kubenswrapper[4885]: I0308 22:09:02.823652 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:09:32 crc kubenswrapper[4885]: I0308 22:09:32.818188 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:09:32 crc kubenswrapper[4885]: I0308 22:09:32.818801 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:09:32 crc kubenswrapper[4885]: I0308 22:09:32.818859 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 22:09:32 crc kubenswrapper[4885]: I0308 22:09:32.820013 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 22:09:32 crc kubenswrapper[4885]: I0308 22:09:32.820108 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d" gracePeriod=600 Mar 08 22:09:33 crc kubenswrapper[4885]: I0308 22:09:33.297047 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d" exitCode=0 Mar 08 22:09:33 crc kubenswrapper[4885]: I0308 22:09:33.297457 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d"} Mar 08 22:09:33 crc kubenswrapper[4885]: I0308 22:09:33.297536 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"} Mar 08 22:09:33 crc kubenswrapper[4885]: I0308 22:09:33.297569 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.169589 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550130-q8btr"] Mar 08 22:10:00 crc kubenswrapper[4885]: E0308 22:10:00.171316 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16108583-f398-4571-9e1c-41d86a071331" containerName="oc" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.171337 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="16108583-f398-4571-9e1c-41d86a071331" containerName="oc" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.171607 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="16108583-f398-4571-9e1c-41d86a071331" containerName="oc" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.172626 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.182851 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.182907 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.182989 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.186202 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550130-q8btr"] Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.286638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx47z\" (UniqueName: \"kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z\") pod \"auto-csr-approver-29550130-q8btr\" (UID: \"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6\") " pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.389382 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx47z\" (UniqueName: \"kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z\") pod \"auto-csr-approver-29550130-q8btr\" (UID: \"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6\") " pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.416064 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx47z\" (UniqueName: \"kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z\") pod \"auto-csr-approver-29550130-q8btr\" (UID: \"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6\") " 
pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.505480 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:01 crc kubenswrapper[4885]: W0308 22:10:01.100820 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bf31f87_6e2d_4ae5_81e7_e3d501dc03d6.slice/crio-e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7 WatchSource:0}: Error finding container e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7: Status 404 returned error can't find the container with id e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7 Mar 08 22:10:01 crc kubenswrapper[4885]: I0308 22:10:01.105545 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 22:10:01 crc kubenswrapper[4885]: I0308 22:10:01.115962 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550130-q8btr"] Mar 08 22:10:01 crc kubenswrapper[4885]: I0308 22:10:01.624823 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550130-q8btr" event={"ID":"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6","Type":"ContainerStarted","Data":"e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7"} Mar 08 22:10:02 crc kubenswrapper[4885]: I0308 22:10:02.638753 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550130-q8btr" event={"ID":"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6","Type":"ContainerStarted","Data":"1c5174db17fa21586bec90f86258445c10bafc4fb6675bd3f58ffbbc2c682873"} Mar 08 22:10:02 crc kubenswrapper[4885]: I0308 22:10:02.668966 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550130-q8btr" 
podStartSLOduration=1.531450139 podStartE2EDuration="2.668892527s" podCreationTimestamp="2026-03-08 22:10:00 +0000 UTC" firstStartedPulling="2026-03-08 22:10:01.105283415 +0000 UTC m=+9502.501337438" lastFinishedPulling="2026-03-08 22:10:02.242725803 +0000 UTC m=+9503.638779826" observedRunningTime="2026-03-08 22:10:02.657591235 +0000 UTC m=+9504.053645298" watchObservedRunningTime="2026-03-08 22:10:02.668892527 +0000 UTC m=+9504.064946590" Mar 08 22:10:03 crc kubenswrapper[4885]: I0308 22:10:03.657352 4885 generic.go:334] "Generic (PLEG): container finished" podID="8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" containerID="1c5174db17fa21586bec90f86258445c10bafc4fb6675bd3f58ffbbc2c682873" exitCode=0 Mar 08 22:10:03 crc kubenswrapper[4885]: I0308 22:10:03.657640 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550130-q8btr" event={"ID":"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6","Type":"ContainerDied","Data":"1c5174db17fa21586bec90f86258445c10bafc4fb6675bd3f58ffbbc2c682873"} Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.168624 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.227529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx47z\" (UniqueName: \"kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z\") pod \"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6\" (UID: \"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6\") " Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.241156 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z" (OuterVolumeSpecName: "kube-api-access-rx47z") pod "8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" (UID: "8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6"). InnerVolumeSpecName "kube-api-access-rx47z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.329517 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx47z\" (UniqueName: \"kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z\") on node \"crc\" DevicePath \"\"" Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.691672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550130-q8btr" event={"ID":"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6","Type":"ContainerDied","Data":"e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7"} Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.692101 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7" Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.692194 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.752537 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550124-cd88x"] Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.764193 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550124-cd88x"] Mar 08 22:10:07 crc kubenswrapper[4885]: I0308 22:10:07.385055 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31bb0f9f-aafb-4d40-9ef4-60deec075e85" path="/var/lib/kubelet/pods/31bb0f9f-aafb-4d40-9ef4-60deec075e85/volumes" Mar 08 22:10:15 crc kubenswrapper[4885]: I0308 22:10:15.470277 4885 scope.go:117] "RemoveContainer" containerID="1e7a14780a56f990fdb7ec2362f5fabc1bb27c3bceac15a1a207c4524477403d" Mar 08 22:11:15 crc kubenswrapper[4885]: I0308 22:11:15.605890 4885 scope.go:117] "RemoveContainer" 
containerID="409ba0e76cc123c34984a34f377c29a5545224b014b0468202c033f80128ed06" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.161299 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550132-jwbgs"] Mar 08 22:12:00 crc kubenswrapper[4885]: E0308 22:12:00.162736 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" containerName="oc" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.162760 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" containerName="oc" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.163216 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" containerName="oc" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.164648 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.168571 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.168777 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.168976 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.182449 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550132-jwbgs"] Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.255418 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsm5t\" (UniqueName: \"kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t\") 
pod \"auto-csr-approver-29550132-jwbgs\" (UID: \"5350f846-ee1f-400b-8579-de1a56050f02\") " pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.358008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsm5t\" (UniqueName: \"kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t\") pod \"auto-csr-approver-29550132-jwbgs\" (UID: \"5350f846-ee1f-400b-8579-de1a56050f02\") " pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.381563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsm5t\" (UniqueName: \"kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t\") pod \"auto-csr-approver-29550132-jwbgs\" (UID: \"5350f846-ee1f-400b-8579-de1a56050f02\") " pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.501367 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:01 crc kubenswrapper[4885]: I0308 22:12:01.033521 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550132-jwbgs"] Mar 08 22:12:01 crc kubenswrapper[4885]: I0308 22:12:01.321363 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" event={"ID":"5350f846-ee1f-400b-8579-de1a56050f02","Type":"ContainerStarted","Data":"805b09ae284c11fa6a8b707f213c4248378515482b6ac3e4219b3e4422b2572c"} Mar 08 22:12:02 crc kubenswrapper[4885]: I0308 22:12:02.818459 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:12:02 crc kubenswrapper[4885]: I0308 22:12:02.819001 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:12:03 crc kubenswrapper[4885]: I0308 22:12:03.354671 4885 generic.go:334] "Generic (PLEG): container finished" podID="5350f846-ee1f-400b-8579-de1a56050f02" containerID="381bb8f225c03be035f053937f74c9493566bd9f87da1d7c680e81f6170500d2" exitCode=0 Mar 08 22:12:03 crc kubenswrapper[4885]: I0308 22:12:03.354724 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" event={"ID":"5350f846-ee1f-400b-8579-de1a56050f02","Type":"ContainerDied","Data":"381bb8f225c03be035f053937f74c9493566bd9f87da1d7c680e81f6170500d2"} Mar 08 22:12:04 crc kubenswrapper[4885]: I0308 22:12:04.759151 4885 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:04 crc kubenswrapper[4885]: I0308 22:12:04.888583 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsm5t\" (UniqueName: \"kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t\") pod \"5350f846-ee1f-400b-8579-de1a56050f02\" (UID: \"5350f846-ee1f-400b-8579-de1a56050f02\") " Mar 08 22:12:04 crc kubenswrapper[4885]: I0308 22:12:04.895261 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t" (OuterVolumeSpecName: "kube-api-access-lsm5t") pod "5350f846-ee1f-400b-8579-de1a56050f02" (UID: "5350f846-ee1f-400b-8579-de1a56050f02"). InnerVolumeSpecName "kube-api-access-lsm5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:12:04 crc kubenswrapper[4885]: I0308 22:12:04.992169 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsm5t\" (UniqueName: \"kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t\") on node \"crc\" DevicePath \"\"" Mar 08 22:12:05 crc kubenswrapper[4885]: I0308 22:12:05.378974 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550132-jwbgs"
Mar 08 22:12:05 crc kubenswrapper[4885]: I0308 22:12:05.393157 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" event={"ID":"5350f846-ee1f-400b-8579-de1a56050f02","Type":"ContainerDied","Data":"805b09ae284c11fa6a8b707f213c4248378515482b6ac3e4219b3e4422b2572c"}
Mar 08 22:12:05 crc kubenswrapper[4885]: I0308 22:12:05.393218 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="805b09ae284c11fa6a8b707f213c4248378515482b6ac3e4219b3e4422b2572c"
Mar 08 22:12:05 crc kubenswrapper[4885]: I0308 22:12:05.852197 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550126-tpkcw"]
Mar 08 22:12:05 crc kubenswrapper[4885]: I0308 22:12:05.865246 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550126-tpkcw"]
Mar 08 22:12:07 crc kubenswrapper[4885]: I0308 22:12:07.384148 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db2ac46a-e5ce-45f0-8d95-2f520eebd199" path="/var/lib/kubelet/pods/db2ac46a-e5ce-45f0-8d95-2f520eebd199/volumes"
Mar 08 22:12:15 crc kubenswrapper[4885]: I0308 22:12:15.700776 4885 scope.go:117] "RemoveContainer" containerID="78835dd04d2354f01f1264d7b0e37072d10df2af40d9fb9f18dcf2dd6bfeda09"
Mar 08 22:12:32 crc kubenswrapper[4885]: I0308 22:12:32.818199 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 22:12:32 crc kubenswrapper[4885]: I0308 22:12:32.819025 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 22:13:02 crc kubenswrapper[4885]: I0308 22:13:02.818217 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 22:13:02 crc kubenswrapper[4885]: I0308 22:13:02.818897 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 22:13:02 crc kubenswrapper[4885]: I0308 22:13:02.818973 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 22:13:02 crc kubenswrapper[4885]: I0308 22:13:02.819592 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 22:13:02 crc kubenswrapper[4885]: I0308 22:13:02.819659 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" gracePeriod=600
Mar 08 22:13:02 crc kubenswrapper[4885]: E0308 22:13:02.975194 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:13:03 crc kubenswrapper[4885]: I0308 22:13:03.083634 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" exitCode=0
Mar 08 22:13:03 crc kubenswrapper[4885]: I0308 22:13:03.083698 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"}
Mar 08 22:13:03 crc kubenswrapper[4885]: I0308 22:13:03.083753 4885 scope.go:117] "RemoveContainer" containerID="476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d"
Mar 08 22:13:03 crc kubenswrapper[4885]: I0308 22:13:03.084826 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"
Mar 08 22:13:03 crc kubenswrapper[4885]: E0308 22:13:03.085589 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:13:15 crc kubenswrapper[4885]: I0308 22:13:15.368807 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"
Mar 08 22:13:15 crc kubenswrapper[4885]: E0308 22:13:15.369480 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:13:17 crc kubenswrapper[4885]: I0308 22:13:17.685091 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_55b083d5-789c-424a-8e11-f5e2e4bc51b0/init-config-reloader/0.log"
Mar 08 22:13:17 crc kubenswrapper[4885]: I0308 22:13:17.907330 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_55b083d5-789c-424a-8e11-f5e2e4bc51b0/alertmanager/0.log"
Mar 08 22:13:17 crc kubenswrapper[4885]: I0308 22:13:17.961182 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_55b083d5-789c-424a-8e11-f5e2e4bc51b0/init-config-reloader/0.log"
Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.000694 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_55b083d5-789c-424a-8e11-f5e2e4bc51b0/config-reloader/0.log"
Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.224468 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_737065cc-3153-4e0c-b4ee-4ad587c8d494/aodh-api/0.log"
Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.236149 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_737065cc-3153-4e0c-b4ee-4ad587c8d494/aodh-evaluator/0.log"
Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.324419 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_737065cc-3153-4e0c-b4ee-4ad587c8d494/aodh-listener/0.log"
Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.466351 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_737065cc-3153-4e0c-b4ee-4ad587c8d494/aodh-notifier/0.log"
Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.483031 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-8469b78fd4-9xh8z_ebfd95bc-213c-417c-8dd5-b66637bd98e9/barbican-api/0.log"
Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.549394 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-8469b78fd4-9xh8z_ebfd95bc-213c-417c-8dd5-b66637bd98e9/barbican-api-log/0.log"
Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.755949 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c754cdbdb-h7rpz_a8bdb095-595a-458e-870f-41fea2999d18/barbican-keystone-listener-log/0.log"
Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.759293 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c754cdbdb-h7rpz_a8bdb095-595a-458e-870f-41fea2999d18/barbican-keystone-listener/0.log"
Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.924525 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-755df7d9d5-kl4vq_b7b24c26-4c9a-4442-a124-a66987404ec8/barbican-worker/0.log"
Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.019563 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-755df7d9d5-kl4vq_b7b24c26-4c9a-4442-a124-a66987404ec8/barbican-worker-log/0.log"
Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.087210 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-bcvmz_51b71742-3986-42a4-a016-eeecb3a7ba16/bootstrap-openstack-openstack-cell1/0.log"
Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.231177 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661d1124-50bd-4ad4-95a4-ac90994383b3/ceilometer-central-agent/0.log"
Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.299833 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661d1124-50bd-4ad4-95a4-ac90994383b3/proxy-httpd/0.log"
Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.301847 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661d1124-50bd-4ad4-95a4-ac90994383b3/ceilometer-notification-agent/0.log"
Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.456229 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661d1124-50bd-4ad4-95a4-ac90994383b3/sg-core/0.log"
Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.520519 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-jr59q_9ef426ef-0010-4b6f-8b94-b45e726c2f02/ceph-client-openstack-openstack-cell1/0.log"
Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.713059 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_216289ea-1f99-4924-aa6b-9951b3b3840e/cinder-api/0.log"
Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.758595 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_216289ea-1f99-4924-aa6b-9951b3b3840e/cinder-api-log/0.log"
Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.001483 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_5eb198a5-6241-48b0-bc8c-57ad764a1f3b/probe/0.log"
Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.040532 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_5eb198a5-6241-48b0-bc8c-57ad764a1f3b/cinder-backup/0.log"
Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.086953 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7620c463-ffe0-4d70-ba82-deaef34da248/cinder-scheduler/0.log"
Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.242707 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7620c463-ffe0-4d70-ba82-deaef34da248/probe/0.log"
Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.335002 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_954ec951-d955-4335-93bb-d43e59408ae3/cinder-volume/0.log"
Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.406214 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_954ec951-d955-4335-93bb-d43e59408ae3/probe/0.log"
Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.560374 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-s5bq4_cd7ac915-62c8-4d95-96a3-899c245e685c/configure-network-openstack-openstack-cell1/0.log"
Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.658783 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-9wz6s_eaa3ffc5-e09f-48b4-96b2-e2454bfe6251/configure-os-openstack-openstack-cell1/0.log"
Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.841101 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-579b4494b9-nwf4n_cb658095-55a6-4c1a-a84b-23ad21d14212/init/0.log"
Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.026167 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-579b4494b9-nwf4n_cb658095-55a6-4c1a-a84b-23ad21d14212/init/0.log"
Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.057101 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-wl2gp_d2786842-7b37-4e0c-843e-9dc4467df6ad/download-cache-openstack-openstack-cell1/0.log"
Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.071521 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-579b4494b9-nwf4n_cb658095-55a6-4c1a-a84b-23ad21d14212/dnsmasq-dns/0.log"
Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.304391 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cd58de31-5f82-4acb-8713-397027fbae4f/glance-httpd/0.log"
Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.317546 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cd58de31-5f82-4acb-8713-397027fbae4f/glance-log/0.log"
Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.495714 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c1efb870-06f3-40b8-baca-e418a034eaed/glance-httpd/0.log"
Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.532705 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c1efb870-06f3-40b8-baca-e418a034eaed/glance-log/0.log"
Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.692421 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-796f99d566-r2p9d_78788c18-3ce2-4e27-841d-e7d380fbab71/heat-api/0.log"
Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.852304 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-fffd5d5b8-82pm2_979b34eb-586a-4d86-8e2d-7937614c714a/heat-cfnapi/0.log"
Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.864054 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-595686fb49-hx4rx_9c41cdd1-29dd-4252-b988-1efaeed01573/heat-engine/0.log"
Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.055118 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5b7cfb69fc-bhpx4_f24559d3-3f44-434a-b790-32c52475d532/horizon-log/0.log"
Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.075728 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5b7cfb69fc-bhpx4_f24559d3-3f44-434a-b790-32c52475d532/horizon/0.log"
Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.091816 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-8t465_125b54e2-cc1e-4a7f-83b6-1474e89bad11/install-certs-openstack-openstack-cell1/0.log"
Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.234517 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-mj45h_dcf02d39-6fe8-40ae-bd31-b7d1a38103b4/install-os-openstack-openstack-cell1/0.log"
Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.367793 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-868b8c986d-gxm79_65bf82e2-5440-45b2-b1ff-1f6998ce46f8/keystone-api/0.log"
Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.694413 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29550061-rh8tm_7e397c25-ae37-4c30-83ce-3bdb83f5b9c5/keystone-cron/0.log"
Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.756810 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29550121-m29ck_8bd4a79d-8d75-4e13-8eee-cc51925ca7fd/keystone-cron/0.log"
Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.822713 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d54c8104-6382-4373-a672-8e2ac804ebba/kube-state-metrics/0.log"
Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.981878 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-rd94l_992d3500-f892-42c6-805f-ae9c96793d0f/libvirt-openstack-openstack-cell1/0.log"
Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.095809 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6cffb553-3b2f-404c-a7da-d481d4635cfc/manila-api-log/0.log"
Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.122201 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6cffb553-3b2f-404c-a7da-d481d4635cfc/manila-api/0.log"
Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.190469 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_637ae9d4-1fa5-48e0-87d7-5f6004e0352d/probe/0.log"
Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.252018 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_637ae9d4-1fa5-48e0-87d7-5f6004e0352d/manila-scheduler/0.log"
Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.332664 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_c0142342-e857-4238-b442-8e06ceb406e1/manila-share/0.log"
Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.434463 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_c0142342-e857-4238-b442-8e06ceb406e1/probe/0.log"
Mar 08 22:13:24 crc kubenswrapper[4885]: I0308 22:13:24.530410 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-qt4fx_b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9/neutron-dhcp-openstack-openstack-cell1/0.log"
Mar 08 22:13:24 crc kubenswrapper[4885]: I0308 22:13:24.602244 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67c4b97569-rrjw7_68cdeb73-eb92-4d18-8a9f-a5e3a0a53900/neutron-httpd/0.log"
Mar 08 22:13:24 crc kubenswrapper[4885]: I0308 22:13:24.638641 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67c4b97569-rrjw7_68cdeb73-eb92-4d18-8a9f-a5e3a0a53900/neutron-api/0.log"
Mar 08 22:13:24 crc kubenswrapper[4885]: I0308 22:13:24.809843 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-hwhq9_12740b7f-a6a2-45e2-a288-fbb880a2c72b/neutron-metadata-openstack-openstack-cell1/0.log"
Mar 08 22:13:24 crc kubenswrapper[4885]: I0308 22:13:24.863582 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-f6sk8_13f318e2-a78d-497f-bfbc-4c60d9156220/neutron-sriov-openstack-openstack-cell1/0.log"
Mar 08 22:13:25 crc kubenswrapper[4885]: I0308 22:13:25.088243 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_afd37ef2-90bf-4ea4-86a1-2113a005824e/nova-api-api/0.log"
Mar 08 22:13:25 crc kubenswrapper[4885]: I0308 22:13:25.190531 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_afd37ef2-90bf-4ea4-86a1-2113a005824e/nova-api-log/0.log"
Mar 08 22:13:25 crc kubenswrapper[4885]: I0308 22:13:25.346653 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3534e95e-b33c-4294-98d0-f758ea92cf72/nova-cell0-conductor-conductor/0.log"
Mar 08 22:13:25 crc kubenswrapper[4885]: I0308 22:13:25.495201 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_390628a6-50b8-491e-bc5d-80a524b67be6/nova-cell1-conductor-conductor/0.log"
Mar 08 22:13:25 crc kubenswrapper[4885]: I0308 22:13:25.708507 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b55b7c8a-8888-43e1-a593-d3a1f00cba4c/nova-cell1-novncproxy-novncproxy/0.log"
Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.191881 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx_0ac2d268-855a-485e-a96f-87b5cc0e4f6e/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log"
Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.309438 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-thkw7_aecb4202-1208-4ba5-8515-2ecf99c8c7d1/nova-cell1-openstack-openstack-cell1/0.log"
Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.562680 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a2d37f17-3e80-43b1-b6e3-df2316900973/nova-metadata-log/0.log"
Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.573034 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a2d37f17-3e80-43b1-b6e3-df2316900973/nova-metadata-metadata/0.log"
Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.710418 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_199a5a0a-f05c-4e06-9bee-2a5d0303f3a0/nova-scheduler-scheduler/0.log"
Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.773668 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-754cf98f97-rw6hg_4fef7207-0a04-4fb4-af9e-d9efcd13226f/init/0.log"
Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.067853 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-754cf98f97-rw6hg_4fef7207-0a04-4fb4-af9e-d9efcd13226f/octavia-api-provider-agent/0.log"
Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.079667 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-754cf98f97-rw6hg_4fef7207-0a04-4fb4-af9e-d9efcd13226f/init/0.log"
Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.308588 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-8t2fl_9d4d983f-9ee9-4341-bf69-0c2fc610a2d6/init/0.log"
Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.319832 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-754cf98f97-rw6hg_4fef7207-0a04-4fb4-af9e-d9efcd13226f/octavia-api/0.log"
Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.774694 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-8t2fl_9d4d983f-9ee9-4341-bf69-0c2fc610a2d6/init/0.log"
Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.846442 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-8t2fl_9d4d983f-9ee9-4341-bf69-0c2fc610a2d6/octavia-healthmanager/0.log"
Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.868700 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-pchrs_c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8/init/0.log"
Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.124290 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-pchrs_c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8/init/0.log"
Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.218557 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-pchrs_c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8/octavia-housekeeping/0.log"
Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.282323 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-6f5964dbc9-msqj4_6b1c35f0-ed5f-411a-a0ec-1270fd04e266/init/0.log"
Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.367839 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"
Mar 08 22:13:28 crc kubenswrapper[4885]: E0308 22:13:28.368106 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.406777 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-6f5964dbc9-msqj4_6b1c35f0-ed5f-411a-a0ec-1270fd04e266/octavia-amphora-httpd/0.log"
Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.411884 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-6f5964dbc9-msqj4_6b1c35f0-ed5f-411a-a0ec-1270fd04e266/init/0.log"
Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.502844 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-b8ndv_956e4845-c662-402d-adb6-b05143af6570/init/0.log"
Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.724468 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-b8ndv_956e4845-c662-402d-adb6-b05143af6570/init/0.log"
Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.783170 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-b8ndv_956e4845-c662-402d-adb6-b05143af6570/octavia-rsyslog/0.log"
Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.891634 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8847z_3c0bddea-5630-4e74-8bc9-ec81fc3eba56/init/0.log"
Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.167810 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8847z_3c0bddea-5630-4e74-8bc9-ec81fc3eba56/init/0.log"
Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.231874 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31392f16-4aaa-4512-982e-0c56d9af8200/mysql-bootstrap/0.log"
Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.268622 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8847z_3c0bddea-5630-4e74-8bc9-ec81fc3eba56/octavia-worker/0.log"
Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.408870 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31392f16-4aaa-4512-982e-0c56d9af8200/mysql-bootstrap/0.log"
Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.416461 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31392f16-4aaa-4512-982e-0c56d9af8200/galera/0.log"
Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.534602 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1fcbde6c-f104-4c3b-9937-24728ac572a8/mysql-bootstrap/0.log"
Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.803822 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1fcbde6c-f104-4c3b-9937-24728ac572a8/galera/0.log"
Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.810895 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1fcbde6c-f104-4c3b-9937-24728ac572a8/mysql-bootstrap/0.log"
Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.814668 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_beb866d8-13cb-4dd6-9ce8-a2dad0935453/openstackclient/0.log"
Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.017183 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-5jmft_00348ab8-7686-4e8d-bada-3d9e32edca19/ovn-controller/0.log"
Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.115472 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rzmvz_2db23198-8297-4e77-aed3-78ca89d5e6f8/openstack-network-exporter/0.log"
Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.278803 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b6j88_f9fbe86b-d12b-4122-93b5-4cd373fca82b/ovsdb-server-init/0.log"
Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.530784 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b6j88_f9fbe86b-d12b-4122-93b5-4cd373fca82b/ovsdb-server-init/0.log"
Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.531207 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b6j88_f9fbe86b-d12b-4122-93b5-4cd373fca82b/ovsdb-server/0.log"
Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.534754 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b6j88_f9fbe86b-d12b-4122-93b5-4cd373fca82b/ovs-vswitchd/0.log"
Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.712564 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e81fc01-0a65-4956-9ba5-26ec5f7c25c9/ovn-northd/0.log"
Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.746438 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e81fc01-0a65-4956-9ba5-26ec5f7c25c9/openstack-network-exporter/0.log"
Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.887150 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-mz9gg_ba3efa94-310a-4c53-ac95-2444759b8574/ovn-openstack-openstack-cell1/0.log"
Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.989908 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ebd9461e-0196-4eaf-a733-44340b19d354/openstack-network-exporter/0.log"
Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.061110 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ebd9461e-0196-4eaf-a733-44340b19d354/ovsdbserver-nb/0.log"
Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.203183 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_d605915b-24f4-45ec-bb13-7e7097bb288b/ovsdbserver-nb/0.log"
Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.263475 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_d605915b-24f4-45ec-bb13-7e7097bb288b/openstack-network-exporter/0.log"
Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.398155 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_4eca16e2-6962-4cad-9cbb-23d33af9c10a/ovsdbserver-nb/0.log"
Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.413167 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_4eca16e2-6962-4cad-9cbb-23d33af9c10a/openstack-network-exporter/0.log"
Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.546886 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2c0a1292-7594-49d3-b3f0-2e1a6aa004e2/openstack-network-exporter/0.log"
Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.630900 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2c0a1292-7594-49d3-b3f0-2e1a6aa004e2/ovsdbserver-sb/0.log"
Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.801431 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_1cdde225-3478-4566-9019-df846ce962fb/ovsdbserver-sb/0.log"
Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.805810 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_1cdde225-3478-4566-9019-df846ce962fb/openstack-network-exporter/0.log"
Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.050901 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_11852e05-e4cd-4884-b382-035694906263/ovsdbserver-sb/0.log"
Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.068793 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_11852e05-e4cd-4884-b382-035694906263/openstack-network-exporter/0.log"
Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.574979 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt_87894214-b974-4fc7-b23d-d739fde2466f/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log"
Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.678786 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bc985567d-hcdbz_0741bee5-7932-4af4-a8c1-1e56b754e359/placement-api/0.log"
Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.740156 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bc985567d-hcdbz_0741bee5-7932-4af4-a8c1-1e56b754e359/placement-log/0.log"
Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.920488 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_efd2a302-0f57-40e0-9a28-0a1cdfabfc5e/init-config-reloader/0.log"
Mar 08 22:13:33 crc kubenswrapper[4885]: I0308 22:13:33.056496 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_efd2a302-0f57-40e0-9a28-0a1cdfabfc5e/init-config-reloader/0.log"
Mar 08 22:13:33 crc kubenswrapper[4885]: I0308 22:13:33.165838 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_efd2a302-0f57-40e0-9a28-0a1cdfabfc5e/config-reloader/0.log"
Mar 08 22:13:33 crc kubenswrapper[4885]: I0308 22:13:33.166032 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_efd2a302-0f57-40e0-9a28-0a1cdfabfc5e/thanos-sidecar/0.log"
Mar 08 22:13:33 crc kubenswrapper[4885]: I0308 22:13:33.181625 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_efd2a302-0f57-40e0-9a28-0a1cdfabfc5e/prometheus/0.log"
Mar 08 22:13:33 crc kubenswrapper[4885]: I0308 22:13:33.378298 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f0d39294-b81d-4534-b86a-35a3aea74ed7/setup-container/0.log"
Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.226201 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c0cb204d-b6bd-417e-9b6f-6a0c7faf4820/memcached/0.log"
Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.251421 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f0d39294-b81d-4534-b86a-35a3aea74ed7/rabbitmq/0.log"
Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.266433 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f0d39294-b81d-4534-b86a-35a3aea74ed7/setup-container/0.log"
Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.304588 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_00704ff6-696f-4687-99e0-23bf055d1bef/setup-container/0.log"
Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.494529 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-k564w_cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb/reboot-os-openstack-openstack-cell1/0.log"
Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.544104 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_00704ff6-696f-4687-99e0-23bf055d1bef/rabbitmq/0.log"
Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.566981 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_00704ff6-696f-4687-99e0-23bf055d1bef/setup-container/0.log"
Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.755222 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-fg8lj_53bb70ab-feea-49a2-9850-fc72a2e0f650/run-os-openstack-openstack-cell1/0.log"
Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.843505 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-khtfv_bef3d518-c413-4129-b022-dffb097239b2/ssh-known-hosts-openstack/0.log"
Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.984480 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-9mchk_5583daa6-0c35-4fde-8580-2a4d7ccbfb17/telemetry-openstack-openstack-cell1/0.log"
Mar 08 22:13:35 crc kubenswrapper[4885]: I0308 22:13:35.152455 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6_8be575f8-a741-4b5a-b7fa-c43e5dd65598/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log"
Mar 08 22:13:35 crc kubenswrapper[4885]: I0308 22:13:35.177483 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-lcz5l_df77d68a-3570-49fb-958b-c358543e661f/validate-network-openstack-openstack-cell1/0.log"
Mar 08 22:13:43 crc kubenswrapper[4885]: I0308 22:13:43.373428 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"
Mar 08 22:13:43 crc kubenswrapper[4885]: E0308 22:13:43.374204 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:13:55 crc kubenswrapper[4885]: I0308 22:13:55.367908 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"
Mar 08 22:13:55 crc kubenswrapper[4885]: E0308 22:13:55.368506 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58"
Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.147361 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550134-24s4t"]
Mar 08 22:14:00 crc kubenswrapper[4885]: E0308 22:14:00.148490 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5350f846-ee1f-400b-8579-de1a56050f02" containerName="oc"
Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.148507 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5350f846-ee1f-400b-8579-de1a56050f02" containerName="oc"
Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.148791 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5350f846-ee1f-400b-8579-de1a56050f02" containerName="oc"
Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.149806 4885 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.151889 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.151998 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.152138 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.165483 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550134-24s4t"] Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.278863 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqq2b\" (UniqueName: \"kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b\") pod \"auto-csr-approver-29550134-24s4t\" (UID: \"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3\") " pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.381331 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqq2b\" (UniqueName: \"kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b\") pod \"auto-csr-approver-29550134-24s4t\" (UID: \"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3\") " pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.400749 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqq2b\" (UniqueName: \"kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b\") pod \"auto-csr-approver-29550134-24s4t\" (UID: \"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3\") " 
pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.469459 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.024486 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550134-24s4t"] Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.181213 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/util/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.409444 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/util/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.432111 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/pull/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.464244 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/pull/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.618972 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/util/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.620188 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/pull/0.log" Mar 08 22:14:01 
crc kubenswrapper[4885]: I0308 22:14:01.648155 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/extract/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.754402 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550134-24s4t" event={"ID":"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3","Type":"ContainerStarted","Data":"ac91779228759ac5b56b689ad5065661850f5c58ec91efc057ca78d4de929bef"} Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.142438 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-sbrjr_45c29030-0945-4655-b035-d75e8bf0f818/manager/0.log" Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.604593 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-4hstb_69dc5eb7-1c2e-4fbb-a220-2129df60ffb3/manager/0.log" Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.699529 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-n88vz_d5770638-6059-4ce5-b401-84b0155589a3/manager/0.log" Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.789526 4885 generic.go:334] "Generic (PLEG): container finished" podID="2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" containerID="a08f299dbb605791440e3498bfef15260ed7b91b31657f734e3989c456d8ee4c" exitCode=0 Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.789576 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550134-24s4t" event={"ID":"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3","Type":"ContainerDied","Data":"a08f299dbb605791440e3498bfef15260ed7b91b31657f734e3989c456d8ee4c"} Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.992166 4885 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-xplpw_4742ab81-6c6d-43c8-8025-6a656b8c40dc/manager/0.log" Mar 08 22:14:03 crc kubenswrapper[4885]: I0308 22:14:03.542835 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-nclkr_7180efa7-8d93-436e-8de2-78fe5c173843/manager/0.log" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.306217 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-hlpjf_157555d5-ca64-49f8-8849-cd763c83feda/manager/0.log" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.358627 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-vf24d_9fc40f07-4706-4008-b86e-e73a2f2ab620/manager/0.log" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.428246 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-rplg5_92716f38-db4c-41d9-962d-f3cc2669a7fb/manager/0.log" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.467510 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.585639 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqq2b\" (UniqueName: \"kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b\") pod \"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3\" (UID: \"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3\") " Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.599172 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b" (OuterVolumeSpecName: "kube-api-access-sqq2b") pod "2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" (UID: "2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3"). InnerVolumeSpecName "kube-api-access-sqq2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.687513 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqq2b\" (UniqueName: \"kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b\") on node \"crc\" DevicePath \"\"" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.740118 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-q5hfb_27aa3877-54cd-414d-80a0-ab20a68ed535/manager/0.log" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.808559 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550134-24s4t" event={"ID":"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3","Type":"ContainerDied","Data":"ac91779228759ac5b56b689ad5065661850f5c58ec91efc057ca78d4de929bef"} Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.808608 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac91779228759ac5b56b689ad5065661850f5c58ec91efc057ca78d4de929bef" Mar 08 22:14:04 
crc kubenswrapper[4885]: I0308 22:14:04.808667 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.970302 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-2hsgc_8f363429-f2b7-468c-b74b-ef14ebfab90e/manager/0.log" Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.130874 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-p8r6f_392750e0-9d71-418d-89b0-ec10f33ec505/manager/0.log" Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.480501 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-7vtx7_7c05f3ed-fe8f-47db-b596-8b90b96c295c/manager/0.log" Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.576730 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550128-4vqrp"] Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.593042 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550128-4vqrp"] Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.693937 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-k4r6w_bbb8966a-e61f-427d-af2a-0fdab2348d03/manager/0.log" Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.748657 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-f9jr4_5f89ecdd-60c3-4da6-b185-1f044d8ffc46/manager/0.log" Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.759246 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7_d8de7df0-2dea-4d3c-a02e-57bfabade82f/manager/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.124252 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6f44f7b99f-l5vj4_9acb4d66-3a49-42b7-bd78-4d904f080c50/operator/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.210129 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-w4b99_024a1da8-dfa6-4cdc-a5ec-12b9ce56969a/registry-server/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.529946 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-4gfw2_8d086566-6154-4ddd-8028-a9c203cfec11/manager/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.651072 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-wdrfh_44fbac8d-d81f-4c03-9555-ef33551d478d/manager/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.786991 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pd9b2_a8caa87f-832f-4436-beaa-aaa505de3bac/operator/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.884912 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-7hgld_d9580392-741e-406b-b72d-91aa945f65c2/manager/0.log" Mar 08 22:14:07 crc kubenswrapper[4885]: I0308 22:14:07.147050 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-xf4hm_ea5acc0f-2ad8-46d5-80a2-502e2900fdd6/manager/0.log" Mar 08 22:14:07 crc kubenswrapper[4885]: I0308 22:14:07.177712 4885 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-7mghs_c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b/manager/0.log" Mar 08 22:14:07 crc kubenswrapper[4885]: I0308 22:14:07.352401 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-66zgf_d5136d34-82a8-47c5-9d7d-09e0206587e8/manager/0.log" Mar 08 22:14:07 crc kubenswrapper[4885]: I0308 22:14:07.384878 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16108583-f398-4571-9e1c-41d86a071331" path="/var/lib/kubelet/pods/16108583-f398-4571-9e1c-41d86a071331/volumes" Mar 08 22:14:08 crc kubenswrapper[4885]: I0308 22:14:08.018918 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dfcb4d64f-pzg95_deedb14e-007e-44eb-bd52-85bbc12d0bec/manager/0.log" Mar 08 22:14:09 crc kubenswrapper[4885]: I0308 22:14:09.377118 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:14:09 crc kubenswrapper[4885]: E0308 22:14:09.377371 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:14:15 crc kubenswrapper[4885]: I0308 22:14:15.846872 4885 scope.go:117] "RemoveContainer" containerID="359909f363078610b8800c0c14bcbb9a70bea6cbeb9a5e7e55a65cb5c9ec4e4c" Mar 08 22:14:20 crc kubenswrapper[4885]: I0308 22:14:20.368266 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:14:20 crc kubenswrapper[4885]: E0308 
22:14:20.371339 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:14:29 crc kubenswrapper[4885]: I0308 22:14:29.974064 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-k2rwt_fe3a8c81-8c1d-4b38-9cae-813fb749fd43/control-plane-machine-set-operator/0.log" Mar 08 22:14:30 crc kubenswrapper[4885]: I0308 22:14:30.169160 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cpx85_175c50f5-857d-4697-bcde-2ce47f2edfc5/kube-rbac-proxy/0.log" Mar 08 22:14:30 crc kubenswrapper[4885]: I0308 22:14:30.236864 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cpx85_175c50f5-857d-4697-bcde-2ce47f2edfc5/machine-api-operator/0.log" Mar 08 22:14:31 crc kubenswrapper[4885]: I0308 22:14:31.368240 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:14:31 crc kubenswrapper[4885]: E0308 22:14:31.368939 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:14:43 crc kubenswrapper[4885]: I0308 22:14:43.368360 4885 scope.go:117] "RemoveContainer" 
containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:14:43 crc kubenswrapper[4885]: E0308 22:14:43.369768 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:14:46 crc kubenswrapper[4885]: I0308 22:14:46.757373 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-8wbq2_6da97aa0-4c69-414f-8fda-23403d2346e5/cert-manager-controller/0.log" Mar 08 22:14:46 crc kubenswrapper[4885]: I0308 22:14:46.915075 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-fbnvg_d62feb91-9474-41c0-b79c-93f3f6dd830b/cert-manager-cainjector/0.log" Mar 08 22:14:47 crc kubenswrapper[4885]: I0308 22:14:47.021734 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-kgm5k_de1b5c94-7518-46c5-af4a-2b692d23b3b7/cert-manager-webhook/0.log" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.314840 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:14:51 crc kubenswrapper[4885]: E0308 22:14:51.316002 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" containerName="oc" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.316122 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" containerName="oc" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.316422 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" containerName="oc" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.318497 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.335187 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.423605 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ntsq\" (UniqueName: \"kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.423788 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.423869 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.526317 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " 
pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.526403 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.526489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ntsq\" (UniqueName: \"kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.526872 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.527043 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.560183 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ntsq\" (UniqueName: \"kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 
08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.650414 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:52 crc kubenswrapper[4885]: I0308 22:14:52.157446 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:14:52 crc kubenswrapper[4885]: I0308 22:14:52.307793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerStarted","Data":"410526b2d3e9b4d6a551a2823b5579cf7d0ec2f78e2d4b1b12bd289e95ff9e5f"} Mar 08 22:14:53 crc kubenswrapper[4885]: I0308 22:14:53.322657 4885 generic.go:334] "Generic (PLEG): container finished" podID="52401772-10fd-464c-bb40-dceaaca564db" containerID="e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d" exitCode=0 Mar 08 22:14:53 crc kubenswrapper[4885]: I0308 22:14:53.322738 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerDied","Data":"e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d"} Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.504694 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.508151 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.519485 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.606013 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.606159 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.606284 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4xng\" (UniqueName: \"kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.707933 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4xng\" (UniqueName: \"kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.708072 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.708140 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.708595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.708625 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.730839 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4xng\" (UniqueName: \"kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.833382 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:55 crc kubenswrapper[4885]: I0308 22:14:55.317898 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:14:55 crc kubenswrapper[4885]: W0308 22:14:55.320190 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19f49bd2_97c7_4446_9814_5c5788b65342.slice/crio-cbefe58057fc835b715eeffbc91f9f0c63d91b9e015b2ca024e66e769f47a64e WatchSource:0}: Error finding container cbefe58057fc835b715eeffbc91f9f0c63d91b9e015b2ca024e66e769f47a64e: Status 404 returned error can't find the container with id cbefe58057fc835b715eeffbc91f9f0c63d91b9e015b2ca024e66e769f47a64e Mar 08 22:14:55 crc kubenswrapper[4885]: I0308 22:14:55.346642 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerStarted","Data":"cbefe58057fc835b715eeffbc91f9f0c63d91b9e015b2ca024e66e769f47a64e"} Mar 08 22:14:55 crc kubenswrapper[4885]: I0308 22:14:55.348702 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerStarted","Data":"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728"} Mar 08 22:14:56 crc kubenswrapper[4885]: I0308 22:14:56.366155 4885 generic.go:334] "Generic (PLEG): container finished" podID="19f49bd2-97c7-4446-9814-5c5788b65342" containerID="b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515" exitCode=0 Mar 08 22:14:56 crc kubenswrapper[4885]: I0308 22:14:56.366429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" 
event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerDied","Data":"b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515"} Mar 08 22:14:56 crc kubenswrapper[4885]: I0308 22:14:56.368022 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:14:56 crc kubenswrapper[4885]: E0308 22:14:56.370513 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:14:58 crc kubenswrapper[4885]: I0308 22:14:58.389650 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerStarted","Data":"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba"} Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.160241 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq"] Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.162374 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.164917 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.164910 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.189572 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq"] Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.256071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.256247 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.256606 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g8wm\" (UniqueName: \"kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.358495 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g8wm\" (UniqueName: \"kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.358674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.358767 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.360336 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.384345 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.390831 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g8wm\" (UniqueName: \"kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.423623 4885 generic.go:334] "Generic (PLEG): container finished" podID="52401772-10fd-464c-bb40-dceaaca564db" containerID="360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728" exitCode=0 Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.423700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerDied","Data":"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728"} Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.425639 4885 generic.go:334] "Generic (PLEG): container finished" podID="19f49bd2-97c7-4446-9814-5c5788b65342" containerID="057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba" exitCode=0 Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.425672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerDied","Data":"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba"} Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.497837 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.981512 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq"] Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.448582 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerStarted","Data":"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6"} Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.451903 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerStarted","Data":"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d"} Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.453530 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" event={"ID":"bb0d15ae-9873-4045-96ff-f333ea013dcb","Type":"ContainerStarted","Data":"469404e34da43d4cd1a85290781b0cb6d733331a47e042aea3b888ec12952ffe"} Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.453586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" event={"ID":"bb0d15ae-9873-4045-96ff-f333ea013dcb","Type":"ContainerStarted","Data":"bb6e5861fa5a2a72d4a68d0f6c81188939f789e64e9c657c693aa24b9c5f3428"} Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.477982 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gxvs5" podStartSLOduration=3.883050215 podStartE2EDuration="11.477952851s" podCreationTimestamp="2026-03-08 22:14:51 +0000 UTC" firstStartedPulling="2026-03-08 22:14:53.325448293 +0000 UTC 
m=+9794.721502316" lastFinishedPulling="2026-03-08 22:15:00.920350929 +0000 UTC m=+9802.316404952" observedRunningTime="2026-03-08 22:15:02.472252198 +0000 UTC m=+9803.868306221" watchObservedRunningTime="2026-03-08 22:15:02.477952851 +0000 UTC m=+9803.874006894" Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.502454 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9xscz" podStartSLOduration=3.894324767 podStartE2EDuration="8.502438225s" podCreationTimestamp="2026-03-08 22:14:54 +0000 UTC" firstStartedPulling="2026-03-08 22:14:56.369681994 +0000 UTC m=+9797.765736027" lastFinishedPulling="2026-03-08 22:15:00.977795442 +0000 UTC m=+9802.373849485" observedRunningTime="2026-03-08 22:15:02.492736456 +0000 UTC m=+9803.888790479" watchObservedRunningTime="2026-03-08 22:15:02.502438225 +0000 UTC m=+9803.898492238" Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.524582 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" podStartSLOduration=2.5245670049999998 podStartE2EDuration="2.524567005s" podCreationTimestamp="2026-03-08 22:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 22:15:02.51539745 +0000 UTC m=+9803.911451473" watchObservedRunningTime="2026-03-08 22:15:02.524567005 +0000 UTC m=+9803.920621028" Mar 08 22:15:03 crc kubenswrapper[4885]: I0308 22:15:03.464564 4885 generic.go:334] "Generic (PLEG): container finished" podID="bb0d15ae-9873-4045-96ff-f333ea013dcb" containerID="469404e34da43d4cd1a85290781b0cb6d733331a47e042aea3b888ec12952ffe" exitCode=0 Mar 08 22:15:03 crc kubenswrapper[4885]: I0308 22:15:03.464757 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" 
event={"ID":"bb0d15ae-9873-4045-96ff-f333ea013dcb","Type":"ContainerDied","Data":"469404e34da43d4cd1a85290781b0cb6d733331a47e042aea3b888ec12952ffe"} Mar 08 22:15:03 crc kubenswrapper[4885]: I0308 22:15:03.947953 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-ftcgc_c548cbba-61a5-4167-b494-f57c45b1599b/nmstate-console-plugin/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.138863 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6m2b5_74d96fe5-1ab9-4703-8717-509cf115d985/nmstate-handler/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.199030 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-wsk7q_75f588d1-7159-4a94-bf89-bb18a880a403/kube-rbac-proxy/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.387103 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-wsk7q_75f588d1-7159-4a94-bf89-bb18a880a403/nmstate-metrics/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.388982 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-5twjk_02d2b43e-55f4-49f1-9bb1-3e70ed22a3da/nmstate-operator/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.619279 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-bb7k9_3793d26a-a132-40db-b8fe-2cf83428b03c/nmstate-webhook/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.834430 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.834491 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:04 crc kubenswrapper[4885]: 
I0308 22:15:04.898216 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.954561 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume\") pod \"bb0d15ae-9873-4045-96ff-f333ea013dcb\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.954829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume\") pod \"bb0d15ae-9873-4045-96ff-f333ea013dcb\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.954895 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g8wm\" (UniqueName: \"kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm\") pod \"bb0d15ae-9873-4045-96ff-f333ea013dcb\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.957651 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume" (OuterVolumeSpecName: "config-volume") pod "bb0d15ae-9873-4045-96ff-f333ea013dcb" (UID: "bb0d15ae-9873-4045-96ff-f333ea013dcb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.962941 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm" (OuterVolumeSpecName: "kube-api-access-8g8wm") pod "bb0d15ae-9873-4045-96ff-f333ea013dcb" (UID: "bb0d15ae-9873-4045-96ff-f333ea013dcb"). InnerVolumeSpecName "kube-api-access-8g8wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.976191 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bb0d15ae-9873-4045-96ff-f333ea013dcb" (UID: "bb0d15ae-9873-4045-96ff-f333ea013dcb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.057734 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.057764 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g8wm\" (UniqueName: \"kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.057775 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.485819 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" 
event={"ID":"bb0d15ae-9873-4045-96ff-f333ea013dcb","Type":"ContainerDied","Data":"bb6e5861fa5a2a72d4a68d0f6c81188939f789e64e9c657c693aa24b9c5f3428"} Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.485865 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb6e5861fa5a2a72d4a68d0f6c81188939f789e64e9c657c693aa24b9c5f3428" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.485954 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.892497 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-9xscz" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="registry-server" probeResult="failure" output=< Mar 08 22:15:05 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 22:15:05 crc kubenswrapper[4885]: > Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.988616 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb"] Mar 08 22:15:06 crc kubenswrapper[4885]: I0308 22:15:06.003505 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb"] Mar 08 22:15:07 crc kubenswrapper[4885]: I0308 22:15:07.383072 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" path="/var/lib/kubelet/pods/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba/volumes" Mar 08 22:15:09 crc kubenswrapper[4885]: I0308 22:15:09.375991 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:15:09 crc kubenswrapper[4885]: E0308 22:15:09.376559 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:15:11 crc kubenswrapper[4885]: I0308 22:15:11.651336 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:15:11 crc kubenswrapper[4885]: I0308 22:15:11.651722 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:15:12 crc kubenswrapper[4885]: I0308 22:15:12.722526 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gxvs5" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" probeResult="failure" output=< Mar 08 22:15:12 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 22:15:12 crc kubenswrapper[4885]: > Mar 08 22:15:14 crc kubenswrapper[4885]: I0308 22:15:14.896781 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:14 crc kubenswrapper[4885]: I0308 22:15:14.971720 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:16 crc kubenswrapper[4885]: I0308 22:15:16.127045 4885 scope.go:117] "RemoveContainer" containerID="8ffeb3ea1d44ddbc8ed5f91dcd1d3740e5d0c398b63612136a09bb9296a735fb" Mar 08 22:15:17 crc kubenswrapper[4885]: I0308 22:15:17.811232 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:15:17 crc kubenswrapper[4885]: I0308 22:15:17.812035 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-9xscz" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="registry-server" containerID="cri-o://39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d" gracePeriod=2 Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.305785 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.351154 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities\") pod \"19f49bd2-97c7-4446-9814-5c5788b65342\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.351219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content\") pod \"19f49bd2-97c7-4446-9814-5c5788b65342\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.351288 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4xng\" (UniqueName: \"kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng\") pod \"19f49bd2-97c7-4446-9814-5c5788b65342\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.353158 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities" (OuterVolumeSpecName: "utilities") pod "19f49bd2-97c7-4446-9814-5c5788b65342" (UID: "19f49bd2-97c7-4446-9814-5c5788b65342"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.372632 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng" (OuterVolumeSpecName: "kube-api-access-x4xng") pod "19f49bd2-97c7-4446-9814-5c5788b65342" (UID: "19f49bd2-97c7-4446-9814-5c5788b65342"). InnerVolumeSpecName "kube-api-access-x4xng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.387216 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19f49bd2-97c7-4446-9814-5c5788b65342" (UID: "19f49bd2-97c7-4446-9814-5c5788b65342"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.454278 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.454312 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.454322 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4xng\" (UniqueName: \"kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.637594 4885 generic.go:334] "Generic (PLEG): container finished" podID="19f49bd2-97c7-4446-9814-5c5788b65342" 
containerID="39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d" exitCode=0 Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.637670 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerDied","Data":"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d"} Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.638178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerDied","Data":"cbefe58057fc835b715eeffbc91f9f0c63d91b9e015b2ca024e66e769f47a64e"} Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.638231 4885 scope.go:117] "RemoveContainer" containerID="39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.637739 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.669389 4885 scope.go:117] "RemoveContainer" containerID="057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.680454 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.690256 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.697805 4885 scope.go:117] "RemoveContainer" containerID="b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.771742 4885 scope.go:117] "RemoveContainer" containerID="39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d" Mar 08 22:15:18 crc kubenswrapper[4885]: E0308 22:15:18.772268 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d\": container with ID starting with 39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d not found: ID does not exist" containerID="39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.772338 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d"} err="failed to get container status \"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d\": rpc error: code = NotFound desc = could not find container \"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d\": container with ID starting with 39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d not found: 
ID does not exist" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.772377 4885 scope.go:117] "RemoveContainer" containerID="057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba" Mar 08 22:15:18 crc kubenswrapper[4885]: E0308 22:15:18.772766 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba\": container with ID starting with 057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba not found: ID does not exist" containerID="057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.772812 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba"} err="failed to get container status \"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba\": rpc error: code = NotFound desc = could not find container \"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba\": container with ID starting with 057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba not found: ID does not exist" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.772841 4885 scope.go:117] "RemoveContainer" containerID="b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515" Mar 08 22:15:18 crc kubenswrapper[4885]: E0308 22:15:18.773189 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515\": container with ID starting with b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515 not found: ID does not exist" containerID="b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.773233 4885 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515"} err="failed to get container status \"b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515\": rpc error: code = NotFound desc = could not find container \"b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515\": container with ID starting with b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515 not found: ID does not exist" Mar 08 22:15:19 crc kubenswrapper[4885]: I0308 22:15:19.383990 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" path="/var/lib/kubelet/pods/19f49bd2-97c7-4446-9814-5c5788b65342/volumes" Mar 08 22:15:21 crc kubenswrapper[4885]: I0308 22:15:21.573390 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-brf5z_c9864aac-5821-4f9b-bcc8-f07752f987b7/prometheus-operator/0.log" Mar 08 22:15:21 crc kubenswrapper[4885]: I0308 22:15:21.624938 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429_65ea3078-ccec-4913-9ce0-873ad93efd0e/prometheus-operator-admission-webhook/0.log" Mar 08 22:15:21 crc kubenswrapper[4885]: I0308 22:15:21.774247 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7_0fe4d43f-e037-431e-98e3-d50194963def/prometheus-operator-admission-webhook/0.log" Mar 08 22:15:21 crc kubenswrapper[4885]: I0308 22:15:21.828145 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-qfwg5_482d7874-16e6-4043-95b1-59222dab9edc/operator/0.log" Mar 08 22:15:21 crc kubenswrapper[4885]: I0308 22:15:21.955077 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-m8k65_062a5ba6-b2c8-4b0c-95e1-d51c1196f367/perses-operator/0.log" Mar 08 22:15:22 crc kubenswrapper[4885]: I0308 22:15:22.700887 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gxvs5" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" probeResult="failure" output=< Mar 08 22:15:22 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 22:15:22 crc kubenswrapper[4885]: > Mar 08 22:15:23 crc kubenswrapper[4885]: I0308 22:15:23.368478 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:15:23 crc kubenswrapper[4885]: E0308 22:15:23.369103 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.038246 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:30 crc kubenswrapper[4885]: E0308 22:15:30.039320 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="extract-utilities" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039334 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="extract-utilities" Mar 08 22:15:30 crc kubenswrapper[4885]: E0308 22:15:30.039354 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="registry-server" 
Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039360 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="registry-server" Mar 08 22:15:30 crc kubenswrapper[4885]: E0308 22:15:30.039375 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0d15ae-9873-4045-96ff-f333ea013dcb" containerName="collect-profiles" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039381 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0d15ae-9873-4045-96ff-f333ea013dcb" containerName="collect-profiles" Mar 08 22:15:30 crc kubenswrapper[4885]: E0308 22:15:30.039400 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="extract-content" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039406 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="extract-content" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039654 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="registry-server" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039665 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0d15ae-9873-4045-96ff-f333ea013dcb" containerName="collect-profiles" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.041606 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.052129 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.158935 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znggx\" (UniqueName: \"kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.159232 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.159579 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.261966 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.262148 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.262498 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.262604 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.262788 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znggx\" (UniqueName: \"kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.282067 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znggx\" (UniqueName: \"kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.366774 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.895780 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:31 crc kubenswrapper[4885]: I0308 22:15:31.771246 4885 generic.go:334] "Generic (PLEG): container finished" podID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerID="6609edb3e701aaab6662d1f5509505324f944e8ead3a92d45ce10f8e8a141f42" exitCode=0 Mar 08 22:15:31 crc kubenswrapper[4885]: I0308 22:15:31.771355 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerDied","Data":"6609edb3e701aaab6662d1f5509505324f944e8ead3a92d45ce10f8e8a141f42"} Mar 08 22:15:31 crc kubenswrapper[4885]: I0308 22:15:31.771554 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerStarted","Data":"fa30b1d03ae78f1ddefa0bbe2c4ca9e029a71f271d639838db899fda7c6403bc"} Mar 08 22:15:31 crc kubenswrapper[4885]: I0308 22:15:31.776180 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 22:15:33 crc kubenswrapper[4885]: I0308 22:15:33.433699 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gxvs5" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" probeResult="failure" output=< Mar 08 22:15:33 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 22:15:33 crc kubenswrapper[4885]: > Mar 08 22:15:33 crc kubenswrapper[4885]: I0308 22:15:33.791871 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" 
event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerStarted","Data":"8e0a849eed3599c2bb3cd606a45cc6d9d875f4438f672ab3a1daf074c9663f7a"} Mar 08 22:15:34 crc kubenswrapper[4885]: I0308 22:15:34.803114 4885 generic.go:334] "Generic (PLEG): container finished" podID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerID="8e0a849eed3599c2bb3cd606a45cc6d9d875f4438f672ab3a1daf074c9663f7a" exitCode=0 Mar 08 22:15:34 crc kubenswrapper[4885]: I0308 22:15:34.803231 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerDied","Data":"8e0a849eed3599c2bb3cd606a45cc6d9d875f4438f672ab3a1daf074c9663f7a"} Mar 08 22:15:35 crc kubenswrapper[4885]: I0308 22:15:35.825292 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerStarted","Data":"576fbed1dd2f3fbde9539787147b1274c70f39d716fb007c750e3c7d20de26ce"} Mar 08 22:15:35 crc kubenswrapper[4885]: I0308 22:15:35.857313 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fdhwr" podStartSLOduration=2.405334161 podStartE2EDuration="5.857289303s" podCreationTimestamp="2026-03-08 22:15:30 +0000 UTC" firstStartedPulling="2026-03-08 22:15:31.775724308 +0000 UTC m=+9833.171778371" lastFinishedPulling="2026-03-08 22:15:35.22767948 +0000 UTC m=+9836.623733513" observedRunningTime="2026-03-08 22:15:35.84403901 +0000 UTC m=+9837.240093063" watchObservedRunningTime="2026-03-08 22:15:35.857289303 +0000 UTC m=+9837.253343336" Mar 08 22:15:38 crc kubenswrapper[4885]: I0308 22:15:38.368854 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:15:38 crc kubenswrapper[4885]: E0308 22:15:38.370498 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.366910 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.367413 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.432409 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.498890 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-xj2vs_6d11a8df-ce5d-404a-b827-822101b061c8/kube-rbac-proxy/0.log" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.702456 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-frr-files/0.log" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.914753 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.991476 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-frr-files/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.006156 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-86ddb6bd46-xj2vs_6d11a8df-ce5d-404a-b827-822101b061c8/controller/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.029188 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-reloader/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.037514 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-metrics/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.173065 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-reloader/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.329401 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-frr-files/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.329859 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-reloader/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.358195 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-metrics/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.392735 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-metrics/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.540911 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-reloader/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.546501 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-frr-files/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.552082 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-metrics/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.606506 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/controller/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.713314 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.765098 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/frr-metrics/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.766702 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.811462 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/kube-rbac-proxy/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.833766 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/kube-rbac-proxy-frr/0.log" Mar 08 22:15:42 crc kubenswrapper[4885]: I0308 22:15:42.068743 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/reloader/0.log" Mar 08 22:15:42 crc kubenswrapper[4885]: I0308 22:15:42.087786 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-cq6xg_e76b0259-0d11-4451-b770-4ca5611ce32e/frr-k8s-webhook-server/0.log" Mar 08 22:15:42 crc kubenswrapper[4885]: I0308 22:15:42.929949 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6bc5657994-v7mn9_ca51bb10-b38d-4e58-9d29-6c6b8922f72e/manager/0.log" Mar 08 22:15:43 crc kubenswrapper[4885]: I0308 22:15:43.125890 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74675b5ddf-p7c2j_6ea4545b-278f-43ff-be3c-fc1346b591a1/webhook-server/0.log" Mar 08 22:15:43 crc kubenswrapper[4885]: I0308 22:15:43.179087 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5nclk_860f2bc3-9bd4-43c5-9400-67293a877c6f/kube-rbac-proxy/0.log" Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.220928 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5nclk_860f2bc3-9bd4-43c5-9400-67293a877c6f/speaker/0.log" Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.416176 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.416435 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fdhwr" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="registry-server" containerID="cri-o://576fbed1dd2f3fbde9539787147b1274c70f39d716fb007c750e3c7d20de26ce" gracePeriod=2 Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.908572 4885 generic.go:334] "Generic (PLEG): container finished" podID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerID="576fbed1dd2f3fbde9539787147b1274c70f39d716fb007c750e3c7d20de26ce" exitCode=0 Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.908611 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerDied","Data":"576fbed1dd2f3fbde9539787147b1274c70f39d716fb007c750e3c7d20de26ce"} Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.908948 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerDied","Data":"fa30b1d03ae78f1ddefa0bbe2c4ca9e029a71f271d639838db899fda7c6403bc"} Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.908964 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa30b1d03ae78f1ddefa0bbe2c4ca9e029a71f271d639838db899fda7c6403bc" Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.955263 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.982668 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content\") pod \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.982970 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities\") pod \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.983079 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znggx\" (UniqueName: \"kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx\") pod \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\" (UID: 
\"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.985478 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities" (OuterVolumeSpecName: "utilities") pod "d913a458-5b1d-491c-bfdc-d2a07f571ce8" (UID: "d913a458-5b1d-491c-bfdc-d2a07f571ce8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.996799 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx" (OuterVolumeSpecName: "kube-api-access-znggx") pod "d913a458-5b1d-491c-bfdc-d2a07f571ce8" (UID: "d913a458-5b1d-491c-bfdc-d2a07f571ce8"). InnerVolumeSpecName "kube-api-access-znggx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.048966 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d913a458-5b1d-491c-bfdc-d2a07f571ce8" (UID: "d913a458-5b1d-491c-bfdc-d2a07f571ce8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.085452 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.085504 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.085520 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znggx\" (UniqueName: \"kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.794102 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/frr/0.log" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.813844 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.814109 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gxvs5" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" containerID="cri-o://6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6" gracePeriod=2 Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.918055 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.006846 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.020775 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.405035 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.513115 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ntsq\" (UniqueName: \"kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq\") pod \"52401772-10fd-464c-bb40-dceaaca564db\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.513457 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content\") pod \"52401772-10fd-464c-bb40-dceaaca564db\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.513548 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities\") pod \"52401772-10fd-464c-bb40-dceaaca564db\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.514650 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities" (OuterVolumeSpecName: "utilities") pod "52401772-10fd-464c-bb40-dceaaca564db" (UID: 
"52401772-10fd-464c-bb40-dceaaca564db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.522481 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq" (OuterVolumeSpecName: "kube-api-access-5ntsq") pod "52401772-10fd-464c-bb40-dceaaca564db" (UID: "52401772-10fd-464c-bb40-dceaaca564db"). InnerVolumeSpecName "kube-api-access-5ntsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.616357 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ntsq\" (UniqueName: \"kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.616406 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.646013 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52401772-10fd-464c-bb40-dceaaca564db" (UID: "52401772-10fd-464c-bb40-dceaaca564db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.718546 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.937856 4885 generic.go:334] "Generic (PLEG): container finished" podID="52401772-10fd-464c-bb40-dceaaca564db" containerID="6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6" exitCode=0 Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.937945 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.937951 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerDied","Data":"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6"} Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.938306 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerDied","Data":"410526b2d3e9b4d6a551a2823b5579cf7d0ec2f78e2d4b1b12bd289e95ff9e5f"} Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.938333 4885 scope.go:117] "RemoveContainer" containerID="6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.978606 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.979370 4885 scope.go:117] "RemoveContainer" containerID="360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 
22:15:46.991045 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.021115 4885 scope.go:117] "RemoveContainer" containerID="e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.075190 4885 scope.go:117] "RemoveContainer" containerID="6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6" Mar 08 22:15:47 crc kubenswrapper[4885]: E0308 22:15:47.075777 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6\": container with ID starting with 6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6 not found: ID does not exist" containerID="6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.075810 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6"} err="failed to get container status \"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6\": rpc error: code = NotFound desc = could not find container \"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6\": container with ID starting with 6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6 not found: ID does not exist" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.075833 4885 scope.go:117] "RemoveContainer" containerID="360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728" Mar 08 22:15:47 crc kubenswrapper[4885]: E0308 22:15:47.076476 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728\": container with ID 
starting with 360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728 not found: ID does not exist" containerID="360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.076533 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728"} err="failed to get container status \"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728\": rpc error: code = NotFound desc = could not find container \"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728\": container with ID starting with 360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728 not found: ID does not exist" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.076566 4885 scope.go:117] "RemoveContainer" containerID="e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d" Mar 08 22:15:47 crc kubenswrapper[4885]: E0308 22:15:47.076872 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d\": container with ID starting with e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d not found: ID does not exist" containerID="e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.076936 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d"} err="failed to get container status \"e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d\": rpc error: code = NotFound desc = could not find container \"e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d\": container with ID starting with e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d not found: 
ID does not exist" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.381557 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52401772-10fd-464c-bb40-dceaaca564db" path="/var/lib/kubelet/pods/52401772-10fd-464c-bb40-dceaaca564db/volumes" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.383497 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" path="/var/lib/kubelet/pods/d913a458-5b1d-491c-bfdc-d2a07f571ce8/volumes" Mar 08 22:15:49 crc kubenswrapper[4885]: I0308 22:15:49.376377 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:15:49 crc kubenswrapper[4885]: E0308 22:15:49.377321 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.108266 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/util/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.269581 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/util/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.303299 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/pull/0.log" Mar 08 22:15:58 crc 
kubenswrapper[4885]: I0308 22:15:58.334438 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/pull/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.512195 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/util/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.550301 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/pull/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.567273 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/extract/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.702841 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/util/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.949164 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/pull/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.951722 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/pull/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.958554 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/util/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.155820 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/util/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.170751 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/pull/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.187226 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/extract/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.364984 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/util/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.536605 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/pull/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.541324 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/pull/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.559287 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/util/0.log" Mar 08 
22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.756718 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/extract/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.788267 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/util/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.815396 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/pull/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.979379 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-utilities/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.140807 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550136-4kln6"] Mar 08 22:16:00 crc kubenswrapper[4885]: E0308 22:16:00.141222 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="extract-content" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141240 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="extract-content" Mar 08 22:16:00 crc kubenswrapper[4885]: E0308 22:16:00.141251 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141258 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" Mar 08 22:16:00 
crc kubenswrapper[4885]: E0308 22:16:00.141272 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="extract-utilities" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141280 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="extract-utilities" Mar 08 22:16:00 crc kubenswrapper[4885]: E0308 22:16:00.141305 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="extract-utilities" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141310 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="extract-utilities" Mar 08 22:16:00 crc kubenswrapper[4885]: E0308 22:16:00.141328 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="extract-content" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141334 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="extract-content" Mar 08 22:16:00 crc kubenswrapper[4885]: E0308 22:16:00.141344 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="registry-server" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141349 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="registry-server" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141559 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141577 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="registry-server" Mar 08 
22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.142271 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.144261 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.144378 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.150289 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.158111 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550136-4kln6"] Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.180875 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-content/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.210531 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-utilities/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.241472 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-content/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.248426 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58tw6\" (UniqueName: \"kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6\") pod \"auto-csr-approver-29550136-4kln6\" (UID: \"89e80cc7-fce3-4c3c-9c10-b76e212f51e0\") " 
pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.350082 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58tw6\" (UniqueName: \"kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6\") pod \"auto-csr-approver-29550136-4kln6\" (UID: \"89e80cc7-fce3-4c3c-9c10-b76e212f51e0\") " pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.403073 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58tw6\" (UniqueName: \"kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6\") pod \"auto-csr-approver-29550136-4kln6\" (UID: \"89e80cc7-fce3-4c3c-9c10-b76e212f51e0\") " pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.425287 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-content/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.437863 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-utilities/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.458099 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.640441 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-utilities/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.897407 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-utilities/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.960769 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550136-4kln6"] Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.970270 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-content/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.001506 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-content/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.104166 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550136-4kln6" event={"ID":"89e80cc7-fce3-4c3c-9c10-b76e212f51e0","Type":"ContainerStarted","Data":"cef497a02157383253457d9710e34df0086facb594f35b48877783afea030628"} Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.398327 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-utilities/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.410457 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-content/0.log" Mar 
08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.566626 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/registry-server/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.736485 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/util/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.858348 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/util/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.941844 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/pull/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.024195 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/pull/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.277412 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/util/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.282299 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/pull/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.292501 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/extract/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.532680 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2774l_1e87323f-cf50-46ef-8e7c-cccd8a1e3601/marketplace-operator/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.563424 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-utilities/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.728975 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/registry-server/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.741288 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-utilities/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.751988 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-content/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.780951 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-content/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.133178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550136-4kln6" event={"ID":"89e80cc7-fce3-4c3c-9c10-b76e212f51e0","Type":"ContainerStarted","Data":"44bdf92c76807ade0aa539cc5874fe807bee8fa87d0c0f440ab4fe4105ecda01"} Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.153866 4885 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550136-4kln6" podStartSLOduration=2.018463094 podStartE2EDuration="3.153839367s" podCreationTimestamp="2026-03-08 22:16:00 +0000 UTC" firstStartedPulling="2026-03-08 22:16:00.966892748 +0000 UTC m=+9862.362946771" lastFinishedPulling="2026-03-08 22:16:02.102269021 +0000 UTC m=+9863.498323044" observedRunningTime="2026-03-08 22:16:03.144447236 +0000 UTC m=+9864.540501259" watchObservedRunningTime="2026-03-08 22:16:03.153839367 +0000 UTC m=+9864.549893430" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.620595 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-utilities/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.646426 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-utilities/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.646465 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-content/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.958256 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-utilities/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.958289 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/registry-server/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.973286 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-content/0.log" Mar 08 22:16:03 crc 
kubenswrapper[4885]: I0308 22:16:03.974739 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-content/0.log" Mar 08 22:16:04 crc kubenswrapper[4885]: I0308 22:16:04.144523 4885 generic.go:334] "Generic (PLEG): container finished" podID="89e80cc7-fce3-4c3c-9c10-b76e212f51e0" containerID="44bdf92c76807ade0aa539cc5874fe807bee8fa87d0c0f440ab4fe4105ecda01" exitCode=0 Mar 08 22:16:04 crc kubenswrapper[4885]: I0308 22:16:04.144564 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550136-4kln6" event={"ID":"89e80cc7-fce3-4c3c-9c10-b76e212f51e0","Type":"ContainerDied","Data":"44bdf92c76807ade0aa539cc5874fe807bee8fa87d0c0f440ab4fe4105ecda01"} Mar 08 22:16:04 crc kubenswrapper[4885]: I0308 22:16:04.144832 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-utilities/0.log" Mar 08 22:16:04 crc kubenswrapper[4885]: I0308 22:16:04.211275 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-content/0.log" Mar 08 22:16:04 crc kubenswrapper[4885]: I0308 22:16:04.367747 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:16:04 crc kubenswrapper[4885]: E0308 22:16:04.368130 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:16:05 crc kubenswrapper[4885]: I0308 22:16:05.206025 4885 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/registry-server/0.log" Mar 08 22:16:05 crc kubenswrapper[4885]: I0308 22:16:05.753197 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:05 crc kubenswrapper[4885]: I0308 22:16:05.909223 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58tw6\" (UniqueName: \"kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6\") pod \"89e80cc7-fce3-4c3c-9c10-b76e212f51e0\" (UID: \"89e80cc7-fce3-4c3c-9c10-b76e212f51e0\") " Mar 08 22:16:05 crc kubenswrapper[4885]: I0308 22:16:05.918304 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6" (OuterVolumeSpecName: "kube-api-access-58tw6") pod "89e80cc7-fce3-4c3c-9c10-b76e212f51e0" (UID: "89e80cc7-fce3-4c3c-9c10-b76e212f51e0"). InnerVolumeSpecName "kube-api-access-58tw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.012294 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58tw6\" (UniqueName: \"kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6\") on node \"crc\" DevicePath \"\"" Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.191672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550136-4kln6" event={"ID":"89e80cc7-fce3-4c3c-9c10-b76e212f51e0","Type":"ContainerDied","Data":"cef497a02157383253457d9710e34df0086facb594f35b48877783afea030628"} Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.191727 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cef497a02157383253457d9710e34df0086facb594f35b48877783afea030628" Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.191815 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.220677 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550130-q8btr"] Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.229705 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550130-q8btr"] Mar 08 22:16:07 crc kubenswrapper[4885]: I0308 22:16:07.377948 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" path="/var/lib/kubelet/pods/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6/volumes" Mar 08 22:16:16 crc kubenswrapper[4885]: I0308 22:16:16.190387 4885 scope.go:117] "RemoveContainer" containerID="1c5174db17fa21586bec90f86258445c10bafc4fb6675bd3f58ffbbc2c682873" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.377708 4885 scope.go:117] "RemoveContainer" 
containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:16:19 crc kubenswrapper[4885]: E0308 22:16:19.379598 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.666200 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-brf5z_c9864aac-5821-4f9b-bcc8-f07752f987b7/prometheus-operator/0.log" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.719485 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429_65ea3078-ccec-4913-9ce0-873ad93efd0e/prometheus-operator-admission-webhook/0.log" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.733665 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7_0fe4d43f-e037-431e-98e3-d50194963def/prometheus-operator-admission-webhook/0.log" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.855511 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-qfwg5_482d7874-16e6-4043-95b1-59222dab9edc/operator/0.log" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.919726 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-m8k65_062a5ba6-b2c8-4b0c-95e1-d51c1196f367/perses-operator/0.log" Mar 08 22:16:29 crc kubenswrapper[4885]: I0308 22:16:29.922877 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:29 crc kubenswrapper[4885]: E0308 22:16:29.923742 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e80cc7-fce3-4c3c-9c10-b76e212f51e0" containerName="oc" Mar 08 22:16:29 crc kubenswrapper[4885]: I0308 22:16:29.923754 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e80cc7-fce3-4c3c-9c10-b76e212f51e0" containerName="oc" Mar 08 22:16:29 crc kubenswrapper[4885]: I0308 22:16:29.929937 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="89e80cc7-fce3-4c3c-9c10-b76e212f51e0" containerName="oc" Mar 08 22:16:29 crc kubenswrapper[4885]: I0308 22:16:29.931695 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:29 crc kubenswrapper[4885]: I0308 22:16:29.937543 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.051166 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.051628 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.051891 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl7dr\" 
(UniqueName: \"kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.153591 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.153670 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.154341 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.154379 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.154636 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl7dr\" (UniqueName: 
\"kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.177796 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl7dr\" (UniqueName: \"kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.253269 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.871467 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:31 crc kubenswrapper[4885]: I0308 22:16:31.368229 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:16:31 crc kubenswrapper[4885]: E0308 22:16:31.368946 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:16:31 crc kubenswrapper[4885]: I0308 22:16:31.488771 4885 generic.go:334] "Generic (PLEG): container finished" podID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerID="d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5" exitCode=0 Mar 08 22:16:31 crc kubenswrapper[4885]: I0308 22:16:31.488954 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerDied","Data":"d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5"} Mar 08 22:16:31 crc kubenswrapper[4885]: I0308 22:16:31.489286 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerStarted","Data":"d11f09a1fc366a8d58eedcdbcbd46f3f35e1bad13a8028153e55cdffe1f6d801"} Mar 08 22:16:33 crc kubenswrapper[4885]: I0308 22:16:33.510118 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerStarted","Data":"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9"} Mar 08 22:16:35 crc kubenswrapper[4885]: I0308 22:16:35.538601 4885 generic.go:334] "Generic (PLEG): container finished" podID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerID="d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9" exitCode=0 Mar 08 22:16:35 crc kubenswrapper[4885]: I0308 22:16:35.538792 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerDied","Data":"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9"} Mar 08 22:16:36 crc kubenswrapper[4885]: I0308 22:16:36.551349 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerStarted","Data":"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b"} Mar 08 22:16:36 crc kubenswrapper[4885]: I0308 22:16:36.567929 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cmbcr" 
podStartSLOduration=3.069929826 podStartE2EDuration="7.567900105s" podCreationTimestamp="2026-03-08 22:16:29 +0000 UTC" firstStartedPulling="2026-03-08 22:16:31.490884002 +0000 UTC m=+9892.886938015" lastFinishedPulling="2026-03-08 22:16:35.988854281 +0000 UTC m=+9897.384908294" observedRunningTime="2026-03-08 22:16:36.564905706 +0000 UTC m=+9897.960959729" watchObservedRunningTime="2026-03-08 22:16:36.567900105 +0000 UTC m=+9897.963954128" Mar 08 22:16:40 crc kubenswrapper[4885]: I0308 22:16:40.254140 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:40 crc kubenswrapper[4885]: I0308 22:16:40.254771 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:40 crc kubenswrapper[4885]: I0308 22:16:40.313402 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:46 crc kubenswrapper[4885]: I0308 22:16:46.368005 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:16:46 crc kubenswrapper[4885]: E0308 22:16:46.368645 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:16:50 crc kubenswrapper[4885]: I0308 22:16:50.308419 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:50 crc kubenswrapper[4885]: I0308 22:16:50.375409 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:50 crc kubenswrapper[4885]: I0308 22:16:50.717982 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cmbcr" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="registry-server" containerID="cri-o://188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b" gracePeriod=2 Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.290751 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.489311 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content\") pod \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.489412 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities\") pod \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.489591 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl7dr\" (UniqueName: \"kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr\") pod \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.491767 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities" (OuterVolumeSpecName: "utilities") pod "4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" (UID: 
"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.495817 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr" (OuterVolumeSpecName: "kube-api-access-gl7dr") pod "4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" (UID: "4f02b2be-ffee-4ef7-aab7-d23fe2e81b15"). InnerVolumeSpecName "kube-api-access-gl7dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.569263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" (UID: "4f02b2be-ffee-4ef7-aab7-d23fe2e81b15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.594060 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl7dr\" (UniqueName: \"kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr\") on node \"crc\" DevicePath \"\"" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.594385 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.594405 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.733898 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerID="188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b" exitCode=0 Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.733964 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerDied","Data":"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b"} Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.734013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerDied","Data":"d11f09a1fc366a8d58eedcdbcbd46f3f35e1bad13a8028153e55cdffe1f6d801"} Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.734031 4885 scope.go:117] "RemoveContainer" containerID="188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.734061 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.782671 4885 scope.go:117] "RemoveContainer" containerID="d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.813429 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.823542 4885 scope.go:117] "RemoveContainer" containerID="d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.825759 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.874727 4885 scope.go:117] "RemoveContainer" containerID="188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b" Mar 08 22:16:51 crc kubenswrapper[4885]: E0308 22:16:51.875165 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b\": container with ID starting with 188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b not found: ID does not exist" containerID="188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.875205 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b"} err="failed to get container status \"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b\": rpc error: code = NotFound desc = could not find container \"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b\": container with ID starting with 188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b not 
found: ID does not exist" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.875228 4885 scope.go:117] "RemoveContainer" containerID="d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9" Mar 08 22:16:51 crc kubenswrapper[4885]: E0308 22:16:51.875559 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9\": container with ID starting with d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9 not found: ID does not exist" containerID="d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.875588 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9"} err="failed to get container status \"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9\": rpc error: code = NotFound desc = could not find container \"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9\": container with ID starting with d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9 not found: ID does not exist" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.875606 4885 scope.go:117] "RemoveContainer" containerID="d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5" Mar 08 22:16:51 crc kubenswrapper[4885]: E0308 22:16:51.875856 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5\": container with ID starting with d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5 not found: ID does not exist" containerID="d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.875900 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5"} err="failed to get container status \"d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5\": rpc error: code = NotFound desc = could not find container \"d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5\": container with ID starting with d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5 not found: ID does not exist" Mar 08 22:16:53 crc kubenswrapper[4885]: I0308 22:16:53.381773 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" path="/var/lib/kubelet/pods/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15/volumes" Mar 08 22:16:58 crc kubenswrapper[4885]: I0308 22:16:58.369191 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:16:58 crc kubenswrapper[4885]: E0308 22:16:58.370219 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:17:13 crc kubenswrapper[4885]: I0308 22:17:13.372194 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:17:13 crc kubenswrapper[4885]: E0308 22:17:13.374968 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:17:26 crc kubenswrapper[4885]: I0308 22:17:26.368510 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:17:26 crc kubenswrapper[4885]: E0308 22:17:26.369302 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:17:39 crc kubenswrapper[4885]: I0308 22:17:39.405894 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:17:39 crc kubenswrapper[4885]: E0308 22:17:39.406985 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:17:50 crc kubenswrapper[4885]: I0308 22:17:50.368079 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:17:50 crc kubenswrapper[4885]: E0308 22:17:50.368998 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.167393 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550138-n9n57"] Mar 08 22:18:00 crc kubenswrapper[4885]: E0308 22:18:00.168662 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="extract-utilities" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.168680 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="extract-utilities" Mar 08 22:18:00 crc kubenswrapper[4885]: E0308 22:18:00.168703 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="registry-server" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.168714 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="registry-server" Mar 08 22:18:00 crc kubenswrapper[4885]: E0308 22:18:00.168742 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="extract-content" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.168751 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="extract-content" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.169031 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="registry-server" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.170044 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.172411 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.172839 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.178840 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.201895 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550138-n9n57"] Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.333847 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvp5k\" (UniqueName: \"kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k\") pod \"auto-csr-approver-29550138-n9n57\" (UID: \"aa83b743-7733-47dc-856b-bcc08f1f6571\") " pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.437256 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvp5k\" (UniqueName: \"kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k\") pod \"auto-csr-approver-29550138-n9n57\" (UID: \"aa83b743-7733-47dc-856b-bcc08f1f6571\") " pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.464407 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvp5k\" (UniqueName: \"kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k\") pod \"auto-csr-approver-29550138-n9n57\" (UID: \"aa83b743-7733-47dc-856b-bcc08f1f6571\") " 
pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.502634 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.964602 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550138-n9n57"] Mar 08 22:18:01 crc kubenswrapper[4885]: I0308 22:18:01.654318 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550138-n9n57" event={"ID":"aa83b743-7733-47dc-856b-bcc08f1f6571","Type":"ContainerStarted","Data":"57a95a0323a9637f304607194e5c006146755c22374b000d00348433cac9d9ca"} Mar 08 22:18:02 crc kubenswrapper[4885]: I0308 22:18:02.668087 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa83b743-7733-47dc-856b-bcc08f1f6571" containerID="cb5e22f23aa19fc5c02b09e4507e9cf89fc1959db4ad0aff6bb1d9208face06e" exitCode=0 Mar 08 22:18:02 crc kubenswrapper[4885]: I0308 22:18:02.668188 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550138-n9n57" event={"ID":"aa83b743-7733-47dc-856b-bcc08f1f6571","Type":"ContainerDied","Data":"cb5e22f23aa19fc5c02b09e4507e9cf89fc1959db4ad0aff6bb1d9208face06e"} Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.104291 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.228834 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvp5k\" (UniqueName: \"kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k\") pod \"aa83b743-7733-47dc-856b-bcc08f1f6571\" (UID: \"aa83b743-7733-47dc-856b-bcc08f1f6571\") " Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.234745 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k" (OuterVolumeSpecName: "kube-api-access-qvp5k") pod "aa83b743-7733-47dc-856b-bcc08f1f6571" (UID: "aa83b743-7733-47dc-856b-bcc08f1f6571"). InnerVolumeSpecName "kube-api-access-qvp5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.332500 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvp5k\" (UniqueName: \"kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k\") on node \"crc\" DevicePath \"\"" Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.369448 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.710357 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"20357516b8a200da650d714af40dc8b472d24afffa0537208eea9c9c2b38005b"} Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.726446 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550138-n9n57" 
event={"ID":"aa83b743-7733-47dc-856b-bcc08f1f6571","Type":"ContainerDied","Data":"57a95a0323a9637f304607194e5c006146755c22374b000d00348433cac9d9ca"} Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.726488 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57a95a0323a9637f304607194e5c006146755c22374b000d00348433cac9d9ca" Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.726548 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:05 crc kubenswrapper[4885]: I0308 22:18:05.207768 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550132-jwbgs"] Mar 08 22:18:05 crc kubenswrapper[4885]: I0308 22:18:05.217176 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550132-jwbgs"] Mar 08 22:18:05 crc kubenswrapper[4885]: I0308 22:18:05.391817 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5350f846-ee1f-400b-8579-de1a56050f02" path="/var/lib/kubelet/pods/5350f846-ee1f-400b-8579-de1a56050f02/volumes" Mar 08 22:18:16 crc kubenswrapper[4885]: I0308 22:18:16.717150 4885 scope.go:117] "RemoveContainer" containerID="381bb8f225c03be035f053937f74c9493566bd9f87da1d7c680e81f6170500d2" Mar 08 22:18:31 crc kubenswrapper[4885]: I0308 22:18:31.067426 4885 generic.go:334] "Generic (PLEG): container finished" podID="009e478e-8f33-43d1-aded-7d3084ed486e" containerID="995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b" exitCode=0 Mar 08 22:18:31 crc kubenswrapper[4885]: I0308 22:18:31.067510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/must-gather-x7xtd" event={"ID":"009e478e-8f33-43d1-aded-7d3084ed486e","Type":"ContainerDied","Data":"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b"} Mar 08 22:18:31 crc kubenswrapper[4885]: I0308 22:18:31.071816 4885 
scope.go:117] "RemoveContainer" containerID="995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b" Mar 08 22:18:31 crc kubenswrapper[4885]: I0308 22:18:31.575798 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-58ck9_must-gather-x7xtd_009e478e-8f33-43d1-aded-7d3084ed486e/gather/0.log" Mar 08 22:18:39 crc kubenswrapper[4885]: I0308 22:18:39.685307 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58ck9/must-gather-x7xtd"] Mar 08 22:18:39 crc kubenswrapper[4885]: I0308 22:18:39.686206 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-58ck9/must-gather-x7xtd" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="copy" containerID="cri-o://d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916" gracePeriod=2 Mar 08 22:18:39 crc kubenswrapper[4885]: I0308 22:18:39.705354 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-58ck9/must-gather-x7xtd"] Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.163416 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-58ck9_must-gather-x7xtd_009e478e-8f33-43d1-aded-7d3084ed486e/copy/0.log" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.164301 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.171716 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-58ck9_must-gather-x7xtd_009e478e-8f33-43d1-aded-7d3084ed486e/copy/0.log" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.172186 4885 generic.go:334] "Generic (PLEG): container finished" podID="009e478e-8f33-43d1-aded-7d3084ed486e" containerID="d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916" exitCode=143 Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.172259 4885 scope.go:117] "RemoveContainer" containerID="d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.172356 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.208711 4885 scope.go:117] "RemoveContainer" containerID="995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.240998 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsvfg\" (UniqueName: \"kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg\") pod \"009e478e-8f33-43d1-aded-7d3084ed486e\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.241246 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output\") pod \"009e478e-8f33-43d1-aded-7d3084ed486e\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.247542 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg" (OuterVolumeSpecName: "kube-api-access-bsvfg") pod "009e478e-8f33-43d1-aded-7d3084ed486e" (UID: "009e478e-8f33-43d1-aded-7d3084ed486e"). InnerVolumeSpecName "kube-api-access-bsvfg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.272376 4885 scope.go:117] "RemoveContainer" containerID="d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916"
Mar 08 22:18:40 crc kubenswrapper[4885]: E0308 22:18:40.273058 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916\": container with ID starting with d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916 not found: ID does not exist" containerID="d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916"
Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.273095 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916"} err="failed to get container status \"d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916\": rpc error: code = NotFound desc = could not find container \"d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916\": container with ID starting with d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916 not found: ID does not exist"
Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.273118 4885 scope.go:117] "RemoveContainer" containerID="995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b"
Mar 08 22:18:40 crc kubenswrapper[4885]: E0308 22:18:40.273350 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b\": container with ID starting with 995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b not found: ID does not exist" containerID="995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b"
Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.273369 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b"} err="failed to get container status \"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b\": rpc error: code = NotFound desc = could not find container \"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b\": container with ID starting with 995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b not found: ID does not exist"
Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.345006 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsvfg\" (UniqueName: \"kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg\") on node \"crc\" DevicePath \"\""
Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.481726 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "009e478e-8f33-43d1-aded-7d3084ed486e" (UID: "009e478e-8f33-43d1-aded-7d3084ed486e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.552246 4885 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 08 22:18:41 crc kubenswrapper[4885]: I0308 22:18:41.382837 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" path="/var/lib/kubelet/pods/009e478e-8f33-43d1-aded-7d3084ed486e/volumes"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.168231 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550140-8xvwg"]
Mar 08 22:20:00 crc kubenswrapper[4885]: E0308 22:20:00.169268 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa83b743-7733-47dc-856b-bcc08f1f6571" containerName="oc"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169283 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa83b743-7733-47dc-856b-bcc08f1f6571" containerName="oc"
Mar 08 22:20:00 crc kubenswrapper[4885]: E0308 22:20:00.169303 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="gather"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169309 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="gather"
Mar 08 22:20:00 crc kubenswrapper[4885]: E0308 22:20:00.169344 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="copy"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169350 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="copy"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169532 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="gather"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169553 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa83b743-7733-47dc-856b-bcc08f1f6571" containerName="oc"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169573 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="copy"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.170353 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550140-8xvwg"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.172060 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.172659 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.177318 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.186864 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550140-8xvwg"]
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.264881 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsn9f\" (UniqueName: \"kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f\") pod \"auto-csr-approver-29550140-8xvwg\" (UID: \"374e97b8-fd74-4ec5-95c5-461c1fef2762\") " pod="openshift-infra/auto-csr-approver-29550140-8xvwg"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.367000 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsn9f\" (UniqueName: \"kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f\") pod \"auto-csr-approver-29550140-8xvwg\" (UID: \"374e97b8-fd74-4ec5-95c5-461c1fef2762\") " pod="openshift-infra/auto-csr-approver-29550140-8xvwg"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.388072 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsn9f\" (UniqueName: \"kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f\") pod \"auto-csr-approver-29550140-8xvwg\" (UID: \"374e97b8-fd74-4ec5-95c5-461c1fef2762\") " pod="openshift-infra/auto-csr-approver-29550140-8xvwg"
Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.502361 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550140-8xvwg"
Mar 08 22:20:01 crc kubenswrapper[4885]: I0308 22:20:01.038194 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550140-8xvwg"]
Mar 08 22:20:01 crc kubenswrapper[4885]: I0308 22:20:01.252264 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550140-8xvwg" event={"ID":"374e97b8-fd74-4ec5-95c5-461c1fef2762","Type":"ContainerStarted","Data":"2dba0dfa4836064998bca97e06644a71094c83fc86d5bd066a192ad06b27f324"}
Mar 08 22:20:04 crc kubenswrapper[4885]: I0308 22:20:04.294572 4885 generic.go:334] "Generic (PLEG): container finished" podID="374e97b8-fd74-4ec5-95c5-461c1fef2762" containerID="80ee712226011463294cc0687d7d3c3e9983afaae20269b7e2ebf527c6e78d3a" exitCode=0
Mar 08 22:20:04 crc kubenswrapper[4885]: I0308 22:20:04.294775 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550140-8xvwg" event={"ID":"374e97b8-fd74-4ec5-95c5-461c1fef2762","Type":"ContainerDied","Data":"80ee712226011463294cc0687d7d3c3e9983afaae20269b7e2ebf527c6e78d3a"}
Mar 08 22:20:06 crc kubenswrapper[4885]: I0308 22:20:06.507801 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550140-8xvwg"
Mar 08 22:20:06 crc kubenswrapper[4885]: I0308 22:20:06.647709 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsn9f\" (UniqueName: \"kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f\") pod \"374e97b8-fd74-4ec5-95c5-461c1fef2762\" (UID: \"374e97b8-fd74-4ec5-95c5-461c1fef2762\") "
Mar 08 22:20:06 crc kubenswrapper[4885]: I0308 22:20:06.655156 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f" (OuterVolumeSpecName: "kube-api-access-nsn9f") pod "374e97b8-fd74-4ec5-95c5-461c1fef2762" (UID: "374e97b8-fd74-4ec5-95c5-461c1fef2762"). InnerVolumeSpecName "kube-api-access-nsn9f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 08 22:20:06 crc kubenswrapper[4885]: I0308 22:20:06.751527 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsn9f\" (UniqueName: \"kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f\") on node \"crc\" DevicePath \"\""
Mar 08 22:20:07 crc kubenswrapper[4885]: I0308 22:20:07.343778 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550140-8xvwg" event={"ID":"374e97b8-fd74-4ec5-95c5-461c1fef2762","Type":"ContainerDied","Data":"2dba0dfa4836064998bca97e06644a71094c83fc86d5bd066a192ad06b27f324"}
Mar 08 22:20:07 crc kubenswrapper[4885]: I0308 22:20:07.344221 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dba0dfa4836064998bca97e06644a71094c83fc86d5bd066a192ad06b27f324"
Mar 08 22:20:07 crc kubenswrapper[4885]: I0308 22:20:07.343860 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550140-8xvwg"
Mar 08 22:20:07 crc kubenswrapper[4885]: I0308 22:20:07.593829 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550134-24s4t"]
Mar 08 22:20:07 crc kubenswrapper[4885]: I0308 22:20:07.605077 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550134-24s4t"]
Mar 08 22:20:09 crc kubenswrapper[4885]: I0308 22:20:09.394733 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" path="/var/lib/kubelet/pods/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3/volumes"
Mar 08 22:20:16 crc kubenswrapper[4885]: I0308 22:20:16.855067 4885 scope.go:117] "RemoveContainer" containerID="a08f299dbb605791440e3498bfef15260ed7b91b31657f734e3989c456d8ee4c"
Mar 08 22:20:32 crc kubenswrapper[4885]: I0308 22:20:32.818694 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 22:20:32 crc kubenswrapper[4885]: I0308 22:20:32.819512 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 22:21:02 crc kubenswrapper[4885]: I0308 22:21:02.818687 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 22:21:02 crc kubenswrapper[4885]: I0308 22:21:02.819520 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 22:21:32 crc kubenswrapper[4885]: I0308 22:21:32.818681 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 08 22:21:32 crc kubenswrapper[4885]: I0308 22:21:32.821480 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 08 22:21:32 crc kubenswrapper[4885]: I0308 22:21:32.821562 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97"
Mar 08 22:21:32 crc kubenswrapper[4885]: I0308 22:21:32.822888 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20357516b8a200da650d714af40dc8b472d24afffa0537208eea9c9c2b38005b"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 08 22:21:32 crc kubenswrapper[4885]: I0308 22:21:32.823066 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://20357516b8a200da650d714af40dc8b472d24afffa0537208eea9c9c2b38005b" gracePeriod=600
Mar 08 22:21:33 crc kubenswrapper[4885]: I0308 22:21:33.485494 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="20357516b8a200da650d714af40dc8b472d24afffa0537208eea9c9c2b38005b" exitCode=0
Mar 08 22:21:33 crc kubenswrapper[4885]: I0308 22:21:33.486228 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"20357516b8a200da650d714af40dc8b472d24afffa0537208eea9c9c2b38005b"}
Mar 08 22:21:33 crc kubenswrapper[4885]: I0308 22:21:33.486273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"6f5dc3e4a7511f98072fc30a5b7227ccfc50f91aa2df95d90ebe92d91fc4d514"}
Mar 08 22:21:33 crc kubenswrapper[4885]: I0308 22:21:33.486322 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"